This page describes known issues in Pipelines. For issues with Cloud Platform CD, see Known issues in Cloud Platform CD.
When a tag is pushed to a Git provider like GitHub or Bitbucket, the associated pipeline job executes. However, the resulting build artifact is pushed as a branch. For example, instead of using the expected naming convention: tags/pipelines-build-<tag_name>, the artifact is associated with the branch name: pipelines-build-<tag_name>. This issue applies only to newly created tags.
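Because the artifact lands on a branch rather than under tags/, retrieve it as a branch. The following is a minimal sketch that assumes a hypothetical tag named v1.2.3 and a remote named origin:
# The artifact for tag v1.2.3 is pushed to the branch pipelines-build-v1.2.3,
# so fetch and check it out as a branch, not as a tag.
git fetch origin pipelines-build-v1.2.3
git checkout pipelines-build-v1.2.3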
As a Multi-Experience Operations (MEO) customer, if you use the pipelines-deploy command in your codebase to create, deploy, or delete a Cloud Platform CD environment, a 5XX server-side error occurs. This error occurs because CD environments are not supported for codebases.
Workaround:
Remove or comment out the pipelines-deploy command in the acquia-pipelines.yaml file (see the sketch after these steps).
Deploy your built code artifacts through the Cloud Platform user interface:
Navigate to your codebase in the Cloud Platform user interface.
Go to the Environments tab.
Locate the environment that you want to update.
Select the kebab menu next to your environment.
Select Switch code.
Choose the desired Git branch and proceed with the deployment.
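The following is a minimal sketch of the first workaround step, assuming pipelines-deploy is invoked from a post-deploy script step; the step name and surrounding file layout are placeholders and may differ from your acquia-pipelines.yaml file:
# acquia-pipelines.yaml (excerpt)
events:
  post-deploy:
    steps:
      - deploy:
          type: script
          script:
            # Commented out because CD environments are not supported for codebases:
            # - pipelines-deploy
            - echo "Skipping CD environment deployment"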
If you receive a Fatal system error message when attempting to run a Pipelines job, an extended Unicode character might be present in one of the job's inputs, such as the commit message or the build definition file.
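To check the build definition file for such characters, you can run a command similar to the following; this is a sketch that assumes GNU grep is available in your environment:
# Lists any lines in the build definition that contain non-ASCII (extended Unicode) characters.
grep -n -P '[^\x00-\x7F]' acquia-pipelines.yaml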
After you upgrade Pipelines to MySQL 8.0, you may encounter collation errors when your Cloud Classic and Site Factory infrastructure runs on MySQL 5.7. This causes database import and synchronization commands such as drush sql-dump to fail, because of an incompatibility between the default collation settings in MySQL 8.0 and MySQL 5.7.
Workaround: To resolve this issue, you can modify your acquia-pipelines.yaml file to include the following line:
mysqldump "${SRC_CONN[@]}" -qcK --set-gtid-purged=OFF --single-transaction --no-tablespaces --no-create-db --max_allowed_packet=100M "$SRC_DBNAME" |However, you might still encounter the Can't sql:dump with MySQL 5.7.41 known issue with the drush sql-dump command in MySQL 5.7. To resolve this issue, do one of the following:
drush sql-sync command instead of drush sql-dump.--no-tablespaces flag to your drush sql-dump command.By default, Playwright installs several browsers. WebKit, one of the default browsers, is not supported by Cloud IDE containers. However, Cloud IDE containers provide the system dependencies necessary to run Firefox and Google Chrome.
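The following is a minimal sketch of the second option, assuming the dump runs from a script step in acquia-pipelines.yaml; the step name and result file path are placeholders, and some Drush releases may need the flag forwarded through --extra-dump instead:
# acquia-pipelines.yaml (excerpt)
events:
  build:
    steps:
      - db-dump:
          type: script
          script:
            # Pass --no-tablespaces so the dump also works against MySQL 5.7.
            - drush sql-dump --no-tablespaces --result-file=/tmp/db-backup.sql
            # If your Drush version rejects the flag directly, forward it to mysqldump:
            # - drush sql-dump --extra-dump=--no-tablespaces --result-file=/tmp/db-backup.sql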
By default, Playwright installs several browsers. WebKit, one of the default browsers, is not supported by Cloud IDE containers. However, Cloud IDE containers provide the system dependencies necessary to run Firefox and Google Chrome.
Workaround: You can install a supported browser such as Google Chrome or Firefox by running one of the following commands:
npx playwright install firefox
npx playwright install chrome
When you enable FIPS in Pipelines, certain applications and methods might break. For example, Ruby might fail with the following error:
FIPS mode is enabled, bundler can't use the CompactIndex API
Workaround: Modify your application to use only FIPS-validated cryptographic methods and avoid using methods like md5 that are prohibited in FIPS. For more information about unsupported methods, see unsupported methods.
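As a purely illustrative sketch of such a change in a build script (the file path is a placeholder), replace MD5 digests with a FIPS-approved SHA-2 digest:
# Before (not FIPS-approved): md5sum build/artifact.tar.gz > build/artifact.tar.gz.md5
# After:
sha256sum build/artifact.tar.gz > build/artifact.tar.gz.sha256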
If a new Cloud Platform environment is provisioned by a CI/CD process, such as Acquia Pipelines or Acquia Code Studio, the new environment is provisioned with the default Cloud Platform PHP version. The new environment does not inherit the PHP settings from the source environment or CI/CD build. For more information on the default PHP version, see default Cloud Platform PHP version.
Workaround: You must set your desired PHP version for the new environment. You can configure the PHP version through the Cloud Platform user interface or the command line.
acli app:task-wait "$(acli api:environments:update $TARGET_ENV_ID --lang_version=8.1)"
On Cloud Next, when Pipelines is configured to use Acquia Git, jobs are not automatically triggered. To execute your job, you must click Start Job. However, if you create a CDE on a Cloud Next application, jobs are triggered for pushes to the branch that is deployed to the CDE.
Workaround: Configure Pipelines to use GitHub or Bitbucket.
To ensure that you are always in sync with Cloud Platform, Node.js 18 and 20 are preinstalled in the Pipelines container. You cannot use nvm to install other Node.js versions, such as a specific 18.x or 20.x release, in the Pipelines container.
If you try to install a different version, your build output displays the following error:
Executing step hello.
+ echo "Hello, Pipelines!" > docroot/index.html
+ nvm install 18.12.0
Downloading and installing node v18.12.0...
Downloading https://nodejs.org/dist/v18.12.0/node-v18.12.0-linux-x64.tar.xz...
########################################
56.3%
########################################################################
100.0%
Computing checksum with sha256sum
Checksums matched!
Now using node v18.12.0 (npm v)
Creating default alias: default -> 18.12.0 (-> v18.12.0 *)
+ node -v
node: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.27' not found (required by node)
node: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.25' not found (required by node)
node: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by node)
+ node -v
In your build file, ensure that you install a major version such as 18 or 20. Do not install a specific version such as 20.7.0.
version: 1.0.0
events:
  build:
    steps:
      - js:
          type: script
          script:
            - cd docroot/modules/custom/my_custom_module
            - node --version
            - nvm use 20
            - node --version
            - npm install
            - npm run build:dev
Users authenticated into cloud.acquia.com using custom identity provider integration receive a 403 error when attempting to authenticate into the Cloud Platform Pipelines UI feature and the Pipelines client/CLI.
If you are experiencing issues with Bitbucket connections, see the important notes at Connecting Pipelines to your Bitbucket repo.
The Bitbucket permissions model requires that the current user owns both the repository, which must exist in the user's personal default workspace, and all of the code being deployed. Repositories cannot be part of a custom workspace.
If permissions are not configured correctly, users might be unable to add branches from external pull requests or experience other issues.
Pipelines works only with Bitbucket Cloud, not with self-managed Bitbucket (Bitbucket Server or Data Center).
The API for Bitbucket does not allow you to pull in the branch from a fork inside a pull request.
You must add a user as a collaborator to your fork to see the user’s branches.
Bitbucket has an optional setting to automatically delete pull request branches after pull requests are merged. This will cause the pr-merged event in Pipelines to fail with an error:
Failed to parse the build file. The VCS path [feature/branch] was not found
in git repository.
Bitbucket uses the username and email address associated with a Git commit to associate the commit with a Bitbucket user account. If no user is associated with a commit, Pipelines will fail with an error:
webhook start failed to start pipeline for app [app]. Operation failed with
the following details: Payload do not have the following required
information.
Workaround: Ensure that the username and email address attached to the commit are associated with a Bitbucket user account.
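For example, you can set the identity that Git records on future commits so that Bitbucket can match it to an account; the name and address below are placeholders:
# Use the name and email address tied to your Bitbucket account.
git config user.name "Jane Developer"
git config user.email "jane@example.com"
# Check which identity an existing commit carries:
git log -1 --format='%an <%ae>'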
OpenSSH version 7.8 and later generate RSA key pairs in a default format that is incompatible with Pipelines. Pipelines jobs that include an SSH key generated in this default format in the ssh-keys section of the build definition file fail with the following error:
Failed to parse the build file. The SSH key named [KEY NAME] is not a valid
SSH private key or requires a password.
Workaround: Generate a key using a format compatible with Pipelines jobs:
ssh-keygen -m PEM -t rsa -b 4096
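If you prefer to convert an existing key rather than generate a new one, ssh-keygen can rewrite it in PEM format in place; this sketch assumes the key is stored at the default ~/.ssh/id_rsa path:
# Rewrites the private key in PEM format; you are prompted for the key's passphrase.
ssh-keygen -p -m PEM -f ~/.ssh/id_rsa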
Pipelines allows you to link repositories to the service to trigger new Pipelines jobs using webhooks.
If you’ve linked Pipelines to a GitHub repository and later revoke the GitHub personal access token used by Pipelines, trying to change the linked Pipelines repository will result in an error: “Please ensure your personal access token is valid. Failed to remove webhook.”
Workaround: Use the Pipelines CLI to reconnect to GitHub, or reconnect GitHub by directly visiting the linking page at the following URL, replacing my-app-id with your application ID:
https://cloud.acquia.com/a/applications/my-app-id/pipelines/github
If a Pipelines job terminates because of a timeout or excessive disk space usage, you cannot access its historical job log. To access the log, run the job again and monitor the streaming log. When streaming the log output through the Pipelines CLI logs command, you can preserve the output by redirecting it to a file.
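For example, assuming the client is installed as pipelines and that its logs command streams to standard output (job selection options are omitted here and depend on your CLI version):
# Stream the job log and keep a copy on disk at the same time.
pipelines logs | tee pipelines-job.log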
Manually starting jobs works only with branches; tags are not supported. To start a job manually, run it from a branch.
If neither branches nor tags trigger your Pipelines jobs automatically, you might need to reset your Pipelines credentials.
If your YAML build definition invokes RVM (Ruby Version Manager), you will encounter errors. To work around the issue, switch your build to rbenv, a lightweight and more developer-friendly way to switch between Ruby versions.
To continue using different versions of Ruby, replace any instances of RVM in your YAML build definition with equivalent rbenv commands.
rvm install x.x.x becomes rbenv install x.x.x.
rvm use x.x.x becomes rbenv global x.x.x or rbenv local x.x.x, depending on the desired behavior.
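The following is a minimal sketch of the resulting build step, assuming rbenv and its ruby-build plugin are available in the container; the step name and Ruby version are placeholders:
# acquia-pipelines.yaml (excerpt)
events:
  build:
    steps:
      - ruby-setup:
          type: script
          script:
            # Was: rvm install 3.1.2 and rvm use 3.1.2
            - rbenv install 3.1.2
            - rbenv global 3.1.2
            - ruby --version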