Pipeline FAQs
This page answers frequently asked questions about pipelines in Harness. For more information about pipelines in general, pipeline components (such as delegates, connectors, and secrets), and module-specific pipelines (such as CI/CD pipelines), go to the module and Platform documentation and the other FAQ pages.
General/miscellaneous
How many pipelines can I have?
There is no limit to the number of pipelines you can create in a project.
Is there a character limit for pipeline names?
Pipeline names are limited to 128 characters.
When I try to open a Git-stored pipeline, why doesn't the branch dropdown display all the branches?
This behavior is expected when there are more than 20-30 branches in the repo due to pagination. To select branches that are not listed, try manually entering the full branch name. This should allow you to open the pipeline from that branch.
Pipeline access control
Can I disable a pipeline?
With Harness CD, you can use a Deployment freeze to do this.
How do I fix the error "You are missing the following permission: Create / Edit Pipelines"?
To create or edit pipelines, you need the Create/Edit (Write) pipelines permission in Harness. Your permissions are determined by your assigned roles and resource groups.
Can I use RBAC to hide pipelines?
Harness RBAC doesn't offer a setting to hide pipelines. However, you can achieve this by creating a role and resource group that has visibility (access) to specific pipelines. Then assign users to the role and resource group accordingly. This way, users can view a limited set of pipelines and execute them only if permitted by the resource group assignments. For more information, go to RBAC in Harness and Resource group scope and refinement.
Can I provide access to specific pipelines in Harness?
You can use Harness RBAC to create a resource group that has visibility (access) to specific pipelines.
Can I allow a user to edit existing pipelines but not create new pipelines?
Harness doesn't distinguish between the create and edit permissions. To block creation while allowing editing of pipelines, use Harness RBAC to create a role with the Create/Edit pipeline permission, and define a resource group containing the specific pipelines that you want the user to be able to edit. Then assign the user to that role and resource group accordingly.
By selecting specific, individual pipelines when configuring the resource group, the user can edit the selected existing pipelines only. They can't create new ones. Note that whenever new pipelines are created, you must adjust this resource group to include the new pipelines if you want the user to be able to edit them.
Can I make my pipeline dependent on the RBAC permissions of the user that runs the pipeline?
Harness doesn't have a variable like <+currentuser.role> that returns the role of the user running the pipeline; however, you can use a variable to get the user's email address. For more information, go to the pipeline.triggeredBy.email expression.
You can also set the first step of the pipeline to call the Get aggregated user endpoint, which lists all roles assigned to the user, and then configure a conditional execution that only allows the pipeline to proceed if the roles pass a JEXL condition.
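The role check described above can be sketched as follows. The JSON shape shown is an assumption for illustration only; confirm the actual response schema of the Get aggregated user endpoint in the Harness API reference before relying on it.

```python
# Sketch: decide whether a pipeline run should proceed based on the roles
# returned by the "Get aggregated user" endpoint. The payload shape below
# (data.roleAssignmentMetadata[].roleName) is an assumption for illustration.

def extract_role_names(aggregated_user: dict) -> list[str]:
    """Collect role names from a (hypothetical) aggregated-user payload."""
    assignments = aggregated_user.get("data", {}).get("roleAssignmentMetadata", [])
    return [a.get("roleName") for a in assignments if a.get("roleName")]

def may_proceed(aggregated_user: dict, required_role: str) -> bool:
    """Python equivalent of a JEXL condition that checks role membership."""
    return required_role in extract_role_names(aggregated_user)
```

In the pipeline itself, you would run the API call in a script step, export the role list as an output variable, and express the same membership check as a JEXL conditional execution expression.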
API
Can I run pipelines through the API or CLI?
Yes, you can execute pipelines through the Pipeline Execution API.
You can use the API curl command or the custom webhook curl command in your preferred CLI to execute pipelines.
The Harness CLI doesn't have a built-in command for pipeline execution.
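As a rough sketch, a script invoking the Pipeline Execution API assembles a URL with the account, org, and project identifiers as query parameters and authenticates with an `x-api-key` header. Treat the exact path and header names here as assumptions and verify them against the current Harness API reference.

```python
# Sketch: build the request for a pipeline-execution POST. The path
# /pipeline/api/pipeline/execute/{id} and the x-api-key header follow the
# public Harness API docs, but should be confirmed before use.
from urllib.parse import urlencode

def build_execution_request(base_url: str, pipeline_id: str,
                            account: str, org: str, project: str,
                            api_key: str) -> tuple[str, dict]:
    """Return the (url, headers) pair for executing a pipeline via the API."""
    params = urlencode({
        "accountIdentifier": account,
        "orgIdentifier": org,
        "projectIdentifier": project,
    })
    url = f"{base_url}/pipeline/api/pipeline/execute/{pipeline_id}?{params}"
    headers = {"x-api-key": api_key, "Content-Type": "application/yaml"}
    return url, headers
```

You would then POST to the returned URL with an HTTP client of your choice (for example, `requests.post(url, headers=headers, data=runtime_yaml)`).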
Which RBAC permissions are required to execute pipeline tasks using the API?
The user needs the same permissions to execute pipeline tasks via API as they would need when running the same tasks through the Harness UI.
Can I block API keys used to trigger pipeline execution per environment?
You can use Harness Service Accounts to define granular roles and permissions for what users have access to. This allows you to scope the API keys to specific resources/environments and ensure there is no cross-scope access.
Which API method can I use to invoke a pipeline when using multiple dynamic parameters?
It depends on your scenario. If you use the same set of inputs to invoke a pipeline, Harness recommends that you use the Execute a Pipeline with Input Set References endpoint.
You can refer to an existing input set in the InputSet API method, so you don't need to specify all the parameters each time. For example, you can create a pre-defined input set called staging-inputset for staging deployments, as well as others for different environments. Then, you can use the environment_name to dynamically select the appropriate input set.
If each execution of your pipeline has its own specific context, so that you need to pass different parameters on each execution, Harness recommends that you use the Execute a Pipeline with Runtime Input YAML endpoint.
Can I use a service account token to approve a pipeline through the API?
Service Account API tokens aren't supported for the Approval API. You must use a personal access token.
Can I use the API to get a failed pipeline's error details?
You can use the getExecutionDetailV2 API to get the executionErrorInfo if the pipeline's status is failed.
What is the GraphQL API query to list executions with details between a specific time range?
Use the following GraphQL query to list execution details within a specific time range:

```graphql
{
  executions(filters: [{startTime: {operator: AFTER, value: 1643285847000}}, {endTime: {operator: BEFORE, value: 1656332247000}}], limit: 30) {
    pageInfo {
      limit
      offset
      total
    }
    nodes {
      startedAt
      endedAt
      tags {
        name
        value
      }
      id
      application {
        id
        name
      }
      status
      cause {
        ... on ExecutedByUser {
          user {
            email
          }
        }
        ... on ExecutedByTrigger {
          trigger {
            id
            name
          }
        }
      }
      ... on PipelineExecution {
        pipeline {
          id
          name
        }
        memberExecutions {
          nodes {
            ... on WorkflowExecution {
              workflow {
                id
                name
              }
              id
              artifacts {
                buildNo
                artifactSource {
                  name
                }
              }
              outcomes {
                nodes {
                  ... on DeploymentOutcome {
                    service {
                      id
                      name
                    }
                    environment {
                      id
                      name
                    }
                  }
                }
              }
            }
          }
        }
      }
      ... on WorkflowExecution {
        workflow {
          id
          name
        }
        id
        artifacts {
          buildNo
          artifactSource {
            name
          }
        }
        outcomes {
          nodes {
            ... on DeploymentOutcome {
              service {
                id
                name
              }
              environment {
                id
                name
              }
            }
          }
        }
      }
    }
  }
}
```
How do I avoid hitting the GitHub API rate limit when using multiple templates and Git-stored pipelines?
Enabling the bi-directional Git Experience can significantly reduce the number of GitHub API calls Harness makes for templates and Git-stored pipelines.
Pipeline triggers
How can I obtain the triggered build version value, trigger ID, or trigger URL during pipeline runtime when a pipeline is triggered by a PR?
Harness provides specific expressions to access trigger information during pipeline runtime. To get the name of the user who initiated the PR, use <+pipeline.triggeredBy.name>. To get the ID of the trigger that fired the execution, use <+pipeline.triggeredBy.triggerIdentifier>. However, Harness doesn't offer an expression that returns the URL of the trigger. Therefore, while you can easily access the username and trigger ID, obtaining the trigger URL directly at pipeline runtime is not supported.
Can I disable pipeline triggers?
On the Triggers page, switch the Enabled toggle to disable a trigger.
What is the webhook identifier for a pipeline trigger?
On the Triggers list, you can find a trigger's identifier (ID) under the trigger name. You can get the webhook URL by selecting the Link icon in the Webhook column.
What is the recommended time delay before making an API call to a pipeline trigger's apiUrl?
A reasonable delay is 20-30 seconds, but there is no precise interval that works for every case.
This API returns successful responses only when pipeline execution kicks in. Between trigger initiation and pipeline start, Harness loads the pipeline YAML and referred templates, which can take some time, especially if the entities are Git-synced and need to be fetched from Git.
To be safe, Harness recommends a polling approach. For example, try calling the API ten seconds after trigger initiation. If this fails, wait another ten seconds and try again, and so on.
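The polling approach above can be sketched as a small retry loop. The `fetch` callable stands in for an HTTP GET against the trigger's apiUrl (for example, a `requests.get` call that returns whether the response indicates success); it is a placeholder, not a Harness API.

```python
# Sketch: retry the trigger's apiUrl every ten seconds until it reports
# success or the attempts run out. `fetch` is an injected stand-in for the
# real HTTP call so the retry logic stays independent of any HTTP client.
import time
from typing import Callable

def poll_trigger_status(fetch: Callable[[], bool],
                        interval_s: float = 10.0,
                        max_attempts: int = 6) -> bool:
    """Call `fetch` up to max_attempts times, sleeping between failures."""
    for attempt in range(max_attempts):
        if fetch():
            return True
        if attempt < max_attempts - 1:
            time.sleep(interval_s)
    return False
```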
Can I create a trigger that starts another pipeline when one pipeline ends?
You can't create a trigger for this, but you can set up Pipeline chaining.