
Intersight Cloud Orchestrator - Component Level Details of Workflows


Component-level details of workflows

Now that we have seen an overview of workflows, we will dive deeper into the components used to build a custom workflow and make it useful for end users.

One thing to keep in mind is that anyone building workflows should approach operations with idempotence in mind: executing a workflow within ICO with the same configuration should produce the same result every time that configuration is applied. For example, running the same provisioning workflow twice should not create a duplicate of the resource it provisions.

Let's dig into these categories of the inner workings of Intersight Cloud Orchestrator:

  • Data Models & Data Types
  • Device Mapping
  • User Input
  • Variables
  • Failure/termination actions
  • Executors

Data Models & Data Types

Data models are at the core of what makes Intersight both powerful and flexible for managing and integrating not only Cisco devices, but also third-party integrations and custom extensibility by end users.

Data models, which are built into Intersight, allow any specific type of data object or physical device managed by Cisco Intersight to have its own individual structure, including properties (attributes) and methods (actions that can be taken against the object/device).

These devices are described within Intersight through a data model built by the vendor, which allows any specific device type, whether from Cisco or a third-party vendor, to expose within Intersight whatever functionality is accessible via its API.

However, users of Intersight Cloud Orchestrator can extend the platform even further with custom data types, which users define themselves.  If no existing data type such as a string or integer meets the needs of a workflow, custom data types allow a new type of object to be defined within Intersight.

These custom data types can be simple or complex objects used as workflow inputs; they can combine one or more string, integer, float, boolean, enum, or JSON inputs, and they can have specific constraints and validations assigned.

Custom data types can be defined and used against any API, whether an Intersight-claimed device, a public API accessible from Intersight, or a private/on-prem API that is reachable via an Intersight Assist appliance.

Custom data types used within tasks or workflows can also be exported from Intersight, so that they can be used by anyone else who imports that workflow.
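
To make this more concrete, the sketch below shows roughly what a custom data type bundling a host name and a protocol choice could look like. This is a simplified illustration only; the ObjectType and field names (workflow.CustomDataTypeDefinition, ParameterSet, and the nested primitive types) are assumptions based on the Intersight workflow data model and should be verified against the current Intersight API reference before use.

"Name": "StorageHostIdentity",
"Label": "Storage Host Identity",
"Description": "Illustrative example only - bundles a host name and a protocol choice",
"ObjectType": "workflow.CustomDataTypeDefinition",
"ParameterSet": [
    {
      "Name": "HostName",
      "Required": true,
      "ObjectType": "workflow.PrimitiveDataType",
      "Properties": {
        "ObjectType": "workflow.PrimitiveDataProperty",
        "Type": "string"
      }
    },
    {
      "Name": "Protocol",
      "Required": true,
      "ObjectType": "workflow.PrimitiveDataType",
      "Properties": {
        "ObjectType": "workflow.PrimitiveDataProperty",
        "Type": "enum",
        "Constraints": {
          "ObjectType": "workflow.Constraints",
          "EnumList": [
            { "Label": "FC", "Value": "FC", "ObjectType": "workflow.EnumEntry" },
            { "Label": "iSCSI", "Value": "iSCSI", "ObjectType": "workflow.EnumEntry" }
          ]
        }
      }
    }
]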

 

Device Mapping

Device mapping is a beneficial construct we can leverage within Intersight Cloud Orchestrator. The ability to specify devices of a particular type allows a workflow creator to limit the choices available to a user when targeting a workflow.

By ensuring that a device of a particular type is selected, and by having all of the device's attributes and metadata available as potential input, workflow creators can structure their workflows much more simply.  They can also deliver a dynamic user experience in which specific options are shown only if they apply to the particular type of device being targeted by the workflow.

This ability to map a specific device type, and to understand all of the attributes and metadata of that device type, drives some of the specific usage highlighted next under user input.
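
As a rough illustration of device mapping in practice, the sketch below shows a workflow input restricted to a specific type of managed object (here, a Pure Storage FlashArray claimed in Intersight). The ObjectType names, the Selector path, and the display attributes are assumptions based on the Intersight workflow data model and are shown for illustration only; verify them against the current API schema before using them.

{
  "Name": "StorageDevice",
  "Label": "Storage Device",
  "Required": true,
  "ObjectType": "workflow.MoReferenceDataType",
  "Properties": {
    "ObjectType": "workflow.MoReferenceProperty",
    "Selector": "/api/v1/storage/PureArrays",
    "DisplayAttributes": [
      "Name",
      "Model"
    ]
  }
}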

 

User Input

When creating a workflow, we start by deciding what steps we are trying to perform, and then we create the list of tasks required to complete all of those steps.  This full list of tasks determines the information necessary to complete the steps, and we must then decide how much of that information we will require the end user to enter.

To understand why we may want to limit the options available to someone running a workflow, we should consider how workflows may be used operationally within a customer organization.  By hardcoding certain values for tasks, we can specify which environments a workflow can be run against or enforce that specific values or device selections are used to meet deployment standards.

Based upon the Organization where a workflow is placed, we can also restrict who can run a workflow and which devices it may be run against, since a user's access is determined by the org and resource group to which their account has been granted privileges.  To clarify, an Intersight Org is a logical entity that supports multi-tenancy within an account through a grouping of resources.

 

Simple User Input

At the basic level, workflow user input includes the detail of task-level inputs laid out in the "Breakdown of ICO Tasks" section.  Most of this centers on allowing a user to provide a string that can be validated with RegEx, enter an integer that can be checked against a specific minimum and maximum range, choose from a preset list of Enum options, or use an object selector to choose a device of a specific type that exists within Intersight.
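
As a hedged sketch of what these basic input definitions can look like in the underlying API request, the fragment below defines a string input validated with a regular expression and an integer input constrained to a minimum and maximum. The names, regex, and limits are illustrative only, and the ObjectType names are assumptions based on the Intersight workflow data model, so verify them against the current API reference.

"InputDefinition": [
    {
      "Name": "VolumeName",
      "Label": "Volume Name",
      "Required": true,
      "ObjectType": "workflow.PrimitiveDataType",
      "Properties": {
        "ObjectType": "workflow.PrimitiveDataProperty",
        "Type": "string",
        "Constraints": {
          "ObjectType": "workflow.Constraints",
          "Regex": "^[a-zA-Z0-9][a-zA-Z0-9-]{0,62}$"
        }
      }
    },
    {
      "Name": "BootVolumeSizeGiB",
      "Label": "Boot Volume Size (GiB)",
      "Required": true,
      "ObjectType": "workflow.PrimitiveDataType",
      "Properties": {
        "ObjectType": "workflow.PrimitiveDataProperty",
        "Type": "integer",
        "Constraints": {
          "ObjectType": "workflow.Constraints",
          "Min": 1,
          "Max": 2048
        }
      }
    }
]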

 

Advanced User Input

Beyond those simple input options, there is another set of functionalities that can take those simple inputs and drive a dynamic workflow experience for the user.

One of the most powerful capabilities that falls into advanced user input is the ability to define parameter sets or progressive disclosure rules.  These rules control which workflow inputs are available or displayed, based on the values of earlier workflow selections.  To gain a better understanding of how to use them, here are some examples:

Parameter Set rule

A parameter set explicitly controls which input fields are displayed, based on an earlier workflow selection.

As an example, let us assume that we give the user an option to provision a volume for a single host versus a host group.

Based on their selection, we will ask the user to input a size in GiB for a boot volume (single host volume), or to input a size in TiB for a datastore volume (hostgroup volume).

(NOTE: An Enum is essentially a specific preset value, and an Enum list is a specific list of preset values.)

  • Enum: Type is Enum; the Enum list is 'Host Volume' and 'Hostgroup Volume'.
  • BootVolumeSize: Type is String.
  • DatastoreVolumeSize: Type is String.
  • HostGroupName: Type is String.

 

We would then create two parameter set rules from the workflow input:

Rule 1 ("NoHostGroupOptions"): During the workflow, if the Enum value is 'Host Volume', then only the 'BootVolumeSize' field is available to choose the size in GiB for a boot volume (single host volume).

Rule 2 ("ShowHostGroupOptions"): During the workflow, if the Enum value is 'Hostgroup Volume', then only the 'DatastoreVolumeSize' field is available to choose the size in GiB for a datastore volume (hostgroup volume), along with the 'HostGroupName' field to input the name of the hostgroup where the volume will be connected.

Within the code for the Intersight API request, we can see an example of how these inputs are made available to or hidden from the user:

"InputParameterSet": [
    {
      "Condition": "eq",
      "ControlParameter": "Enum",
      "EnableParameters": [
        "BootVolumeSize"
      ],
      "Name": "NoHostGroupOptions",
      "ObjectType": "workflow.ParameterSet",
      "Value": "Host Volume"
    },
    {
      "Condition": "eq",
      "ControlParameter": "Enum",
      "EnableParameters": [
        " DatastoreVolumeSize",
        " HostGroupName"
      ],
      "Name": "ShowHostGroupOptions",
      "ObjectType": "workflow.ParameterSet",
      "Value": "Hostgroup Volume"
    }
]

 

The data types which can be used for Parameter Set rules are Boolean, Enum, String, Object Selector, MoReference, and Target.

 

Progressive Disclosure rule

A progressive disclosure rule will instead filter the data available in an input field based upon an earlier selection within a workflow.  The initial input field will have the most options available, but later input fields will have limited options based on the initial input.

For example, let us assume that a user is creating a new storage host to be added to a host group.

We start with a Parameter Set rule named "ShowHostGroup", which will display a "Host Group" option to the user if the Storage Array they choose within the workflow matches the type "storage.PureArray".

We then have a Progressive Disclosure rule that filters the list of Host Groups the user can pick.  We do this via the following mappings, explained after the list:

  • Input Field Attribute: Ancestors.Moid
  • Operator: 'Equal to'
  • Source Input Field: StorageDevice
  • Source Input Field Attribute: Moid

 

To make this easier to follow, let us walk through that mapping.  We start with the Source Input Field, 'StorageDevice', an object data type representing a specific storage array that is claimed in Intersight and chosen from a list by the user.

Before our next step of looking for identifiers and comparing them, we should understand the meaning of a "Moid", or managed object identifier: essentially a GUID (globally unique identifier) assigned to the device when it was claimed into Intersight. With the "Moid" concept in place, we can explain how our progressive disclosure rule uses this information for comparison.

Our rule filters the selection list of host groups to display only those whose "Ancestors.Moid" (the managed object ID of their parent, which is the storage array) matches the "Moid" of the specific storage array chosen by the user earlier in the workflow execution.

This is how the user is shown only the host groups that exist on a specific storage array, rather than all host groups from every Pure array known to Intersight.

Within the code for the Intersight API request, we can see an example of how these inputs are made available to or hidden from the user:

"InputParameterSet": [
    {
      "Condition": "eq",
      "ControlParameter": "StorageDevice.ObjectType",
      "EnableParameters": [
        "HostGroupName"
      ],
      "Name": "ShowHostGroup",
      "ObjectType": "workflow.ParameterSet",
      "Value": "storage.PureArray"
    }
  ],
  "UiInputFilters": [
    {
      "Filters": [
        "Ancestors.Moid eq '${StorageDevice.Moid}'"
      ],
      "Name": "HostGroupName",
      "ObjectType": "workflow.UiInputFilter",
      "UserHelpMessage": "Select 'Storage Device' before selecting 'Host Group'"
    }
]

 

Variables

Workflow variables are similar to local variables used within a function of a script or programming language.  Variables set within a workflow can be used by any task contained within the same workflow; workflow tasks are scoped to a specific workflow and can read or update only the variables defined for that workflow.

Variables can be set to a static value when a specific task is run within the workflow, mapped directly from user inputs, or set dynamically based on device details or task outputs.

Variables can also simplify workflows by providing a single instance of a value that is specific to a conditional branch of a workflow, or they can leverage transformations to hold an altered version of some value, such as a string converted to a different case, so that every mapped instance of the variable is adjusted.

Variables can also leverage advanced mappings, which allow the creator to map a Golang template to a workflow variable, so that some functional logic can be evaluated during the workflow to set the variable's value.
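
As a purely hypothetical sketch of the kind of logic a Golang template can express, the snippet below derives a volume name from earlier selections. Only core Go template actions (if/eq/else/end) are used; the input names and the exact path syntax for referencing workflow inputs in an ICO advanced mapping are assumptions, so consult the Intersight documentation for the real mapping syntax.

{{ if eq .VolumeType "Hostgroup Volume" }}{{ .HostGroupName }}-datastore{{ else }}{{ .HostName }}-boot{{ end }}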

 

Failure/termination actions

Whenever an error or failure is encountered within a workflow, there are multiple options for handling that result.  The simplest method for dealing with errors during a workflow is to retry tasks from the point of failure, but there are more advanced options as well.

If an error occurs during a workflow, a task or workflow creator also has the ability to leverage rollback actions.  Tasks can be programmed with specific details for how to perform a rollback, which means that the task understands how to undo any action it performed.

Evaluating for failures is performed at the task level, and each rollback is handled at the individual task level, which allows a workflow to be stopped and rolled back at any specific task if the rollback action has been defined.

Defining error handling and rollback at the task level is intended to prevent a workflow that encounters an error from halting at a bad point, leaving the environment in an unknown or non-functioning state, or remaining partially completed in a way that prevents it from being exited and executed again.

 

Executors

Executors are advanced functionality that allows the automation and orchestration capabilities of ICO to be extended to control platforms and systems that are not natively integrated with Intersight.

NOTE: While the functionality of executors has been covered in detail in the 'Breakdown of ICO Tasks' section, it is important to highlight them within this component-level workflow overview so they are not overlooked from a workflow perspective.

Executors allow for running scripts within an on-prem customer environment via PowerShell or SSH, running IaC configuration within an on-prem customer environment via Ansible, or invoking API requests against either public APIs accessible from Intersight or private APIs reachable via an Intersight Assist appliance.

Executors also allow ICO to serve as the orchestration engine for scripting or API requests that customers already handle in another fashion, giving them the ability to quickly centralize existing orchestration functions within ICO and make them accessible to users in a simple cloud-based GUI.

Both Ansible and SSH executors can run embedded and reusable tasks, which have different use cases and require different investments in configuration to be used within a workflow.  Embedded tasks leverage existing automation within a customer environment, essentially standalone scripts or playbooks that are injected into an ICO workflow.  Reusable tasks are more generic: they can be fed input variables from other workflow tasks and allow for more flexibility, while also requiring more configuration to handle the injection of external values.

Executors can also leverage a very powerful templating engine that provides far more advanced configuration options than the built-in vendor tasks offer.  The Intersight Help documentation covers the template engine syntax within each specific Task Executor page.
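
As a final hypothetical sketch, an SSH executor command assembled through templating might look like the line below, where the volume and host group values are filled in from workflow inputs. The input names are placeholders and the template path syntax is an assumption rather than the documented ICO syntax; the command itself is a standard Pure Storage FlashArray CLI command used only for illustration.

purehgroup connect --vol {{ .DatastoreVolumeName }} {{ .HostGroupName }}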