---
title: Migrating from Bitbucket Pipelines with GitHub Actions Importer
intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your Bitbucket pipelines to {% data variables.product.prodname_actions %}.'
type: tutorial
shortTitle: Bitbucket Pipelines migration
---
The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Bitbucket Pipelines to {% data variables.product.prodname_actions %}.
{% data reusables.actions.actions-importer-prerequisites %}
There are some limitations when migrating from Bitbucket Pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.

- Images in a private AWS ECR are not supported.
- The Bitbucket Pipelines option `size` is not supported. {% ifversion fpt or ghec %}If additional runner resources are required in {% data variables.product.prodname_actions %}, consider using {% data variables.actions.hosted_runner %}s. For more information, see "AUTOTITLE."{% endif %}
- Metrics detailing the queue time of jobs are not supported by the `forecast` command.
- Bitbucket after-scripts are supported using {% data variables.product.prodname_actions %} `always()` in combination with checking the `steps.<step_id>.conclusion` of the previous step. For more information, see "AUTOTITLE."

  The following is an example of using `always()` with `steps.<step_id>.conclusion`.

  ```yaml
  - name: After Script 1
    run: |-
      echo "I'm after the script ran!"
      echo "We should be grouped!"
    id: after-script-1
    if: "{% raw %}${{ always() }}{% endraw %}"
  - name: After Script 2
    run: |-
      echo "this is really the end"
      echo "goodbye, for now!"
    id: after-script-2
    if: "{% raw %}${{ steps.after-script-1.conclusion == 'success' && always() }}{% endraw %}"
  ```
Certain Bitbucket Pipelines constructs must be migrated manually. These include:
- Secured repository, workspace, and deployment variables
- SSH keys
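Secured variables that you migrate by hand typically become {% data variables.product.prodname_actions %} secrets. As a hedged sketch (the secret name `DEPLOY_TOKEN`, the script, and the step are hypothetical, not importer output), a step that consumed a secured Bitbucket variable might reference the equivalent secret like this:

```yaml
# Hypothetical step: DEPLOY_TOKEN is a repository secret you create manually
# in GitHub to replace the secured Bitbucket variable of the same name.
- name: Deploy
  run: ./deploy.sh
  env:
    DEPLOY_TOKEN: {% raw %}${{ secrets.DEPLOY_TOKEN }}{% endraw %}
```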
{% data reusables.actions.installing-actions-importer %}
The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Bitbucket Pipelines and {% data variables.product.prodname_dotcom %}.
1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "AUTOTITLE."

   Your token must have the `workflow` scope.

   After creating the token, copy it and save it in a safe location for later use.
1. Create a Workspace Access Token for Bitbucket Pipelines. For more information, see Workspace Access Token permissions in the Bitbucket documentation. Your token must have the `read` scope for pipelines, projects, and repositories.
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:

   ```shell
   gh actions-importer configure
   ```

   The `configure` command will prompt you for the following information:

   - For "Which CI providers are you configuring?", use the arrow keys to select `Bitbucket`, press Space to select it, then press Enter.
   - For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
   - For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
   - For "{% data variables.product.pat_generic_caps %} for Bitbucket", enter the Workspace Access Token that you created earlier, and press Enter.
   - For "Base url of the Bitbucket instance", enter the URL for your Bitbucket instance, and press Enter.

   An example of the `configure` command is shown below:

   ```shell
   $ gh actions-importer configure
   ✔ Which CI providers are you configuring?: Bitbucket
   Enter the following values (leave empty to omit):
   ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
   ✔ Base url of the GitHub instance: https://github.com
   ✔ {% data variables.product.pat_generic_caps %} for Bitbucket: ********************
   ✔ Base url of the Bitbucket instance: https://bitbucket.example.com
   Environment variables successfully updated.
   ```
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:

   ```shell
   gh actions-importer update
   ```

   The output of the command should be similar to below:

   ```shell
   Updating ghcr.io/actions-importer/cli:latest...
   ghcr.io/actions-importer/cli:latest up-to-date
   ```
You can use the `audit` command to get a high-level view of pipelines in a Bitbucket instance.

The `audit` command performs the following steps:

- Fetches all of the pipelines for a workspace.
- Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
- Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.

To perform an audit, run the following command in your terminal, replacing `:workspace` with the name of the Bitbucket workspace to audit.

```shell
gh actions-importer audit bitbucket --workspace :workspace --output-dir tmp/audit
```

Optionally, the `--project-key` option can be provided to the `audit` command to limit the results to only pipelines associated with a project. In the example command below, `:project_key` should be replaced with the key of the project to audit. Project keys can be found in Bitbucket on the workspace projects page.

```shell
gh actions-importer audit bitbucket --workspace :workspace --project-key :project_key --output-dir tmp/audit
```
{% data reusables.actions.gai-inspect-audit %}
You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Bitbucket instance.

To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal, replacing `:workspace` with the name of the Bitbucket workspace to forecast. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.

```shell
gh actions-importer forecast bitbucket --workspace :workspace --output-dir tmp/forecast_reports
```

To limit the forecast to a project, you can use the `--project-key` option. Replace `:project_key` with the project key for the project to forecast.

```shell
gh actions-importer forecast bitbucket --workspace :workspace --project-key :project_key --output-dir tmp/forecast_reports
```

The `forecast_report.md` file in the specified output directory contains the results of the forecast.
Listed below are some key terms that can appear in the forecast report:

- The **job count** is the total number of completed jobs.
- The **pipeline count** is the number of unique pipelines used.
- **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
  - This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the {% data variables.product.prodname_actions %} pricing calculator to estimate the costs.
- **Concurrent jobs** metrics describe the number of jobs running at any given time.
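For instance, the execution time metric can be turned into a rough cost figure. The following is a minimal sketch, assuming 12,000 execution minutes and a placeholder per-minute rate of $0.008 (not official pricing; use the {% data variables.product.prodname_actions %} pricing calculator for real figures):

```shell
# Rough cost estimate from a forecast report (all numbers are assumptions).
execution_minutes=12000   # "Execution time" total taken from forecast_report.md
rate_per_minute=0.008     # placeholder Linux-runner rate in USD
awk -v m="$execution_minutes" -v r="$rate_per_minute" \
  'BEGIN { printf "Estimated cost: $%.2f\n", m * r }'
```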
You can use the `dry-run` command to convert a Bitbucket pipeline to its equivalent {% data variables.product.prodname_actions %} workflow. A dry run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.

To perform a dry run of migrating a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `:workspace` with the name of the workspace and `:repo` with the name of the repository in Bitbucket.

```shell
gh actions-importer dry-run bitbucket --workspace :workspace --repository :repo --output-dir tmp/dry-run
```
You can view the logs of the dry run and the converted workflow files in the specified output directory.
{% data reusables.actions.gai-custom-transformers-rec %}
You can use the `migrate` command to convert a Bitbucket pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.

To migrate a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the following values:

- Replace `target-url` with the URL for your {% data variables.product.company_short %} repository.
- Replace `:repo` with the name of the repository in Bitbucket.
- Replace `:workspace` with the name of the workspace.

```shell
gh actions-importer migrate bitbucket --workspace :workspace --repository :repo --target-url https://github.com/:owner/:repo --output-dir tmp/dry-run
```

The command's output includes the URL of the pull request that adds the converted workflow to your repository. An example of successful output is similar to the following:

```shell
$ gh actions-importer migrate bitbucket --workspace actions-importer --repository custom-trigger --target-url https://github.com/valet-dev-testing/demo-private --output-dir tmp/bitbucket
[2023-07-18 09:56:06] Logs: 'tmp/bitbucket/log/valet-20230718-165606.log'
[2023-07-18 09:56:24] Pull request: 'https://github.com/valet-dev-testing/demo-private/pull/55'
```
{% data reusables.actions.gai-inspect-pull-request %}
This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Bitbucket Pipelines.
{% data reusables.actions.gai-config-environment-variables %}
{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Bitbucket instance.

- `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a transformed workflow (requires the `repo` and `workflow` scopes).
- `GITHUB_INSTANCE_URL`: The URL of the target GitHub instance (for example, `https://github.com`).
- `BITBUCKET_ACCESS_TOKEN`: The Workspace Access Token with `read` scopes for pipelines, projects, and repositories.

These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} at run time. The distribution archive contains a `.env.local.template` file that can be used to create these files.
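For example, a `.env.local` file might look like the following (the values are placeholders; substitute your own tokens and URLs):

```shell
# Placeholder values for illustration only; never commit real tokens.
GITHUB_ACCESS_TOKEN=<your-github-pat>
GITHUB_INSTANCE_URL=https://github.com
BITBUCKET_ACCESS_TOKEN=<your-workspace-access-token>
BITBUCKET_INSTANCE_URL=https://bitbucket.example.com
```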
{% data reusables.actions.gai-optional-arguments-intro %}
You can use the `--source-file-path` argument with the `dry-run` or `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bitbucket instance. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.

For example:

```shell
gh actions-importer dry-run bitbucket --workspace :workspace --repository :repo --output-dir tmp/dry-run --source-file-path path/to/my/pipeline/file.yml
```
You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bitbucket instance. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.

```shell
gh actions-importer audit bitbucket --workspace :workspace --output-dir tmp/audit --config-file-path "path/to/my/bitbucket/config.yml"
```

To audit a Bitbucket instance using a config file, the config file must be in the following format, and each `repository_slug` must be unique:

```yaml
source_files:
  - repository_slug: repo_name
    path: path/to/one/source/file.yml
  - repository_slug: another_repo_name
    path: path/to/another/source/file.yml
```
The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.

| Bitbucket | GitHub Actions | Status |
| :--- | :--- | :--- |
| `after-script` | `jobs.<job_id>.steps[*]` | Supported |
| `artifacts` | `actions/upload-artifact` & `actions/download-artifact` | Supported |
| `caches` | `actions/cache` | Supported |
| `clone` | `actions/checkout` | Supported |
| `condition` | `jobs.<job_id>.steps[*].run` | Supported |
| `deployment` | `jobs.<job_id>.environment` | Supported |
| `image` | `jobs.<job_id>.container` | Supported |
| `max-time` | `jobs.<job_id>.steps[*].timeout-minutes` | Supported |
| `options.docker` | None | Supported |
| `options.max-time` | `jobs.<job_id>.steps[*].timeout-minutes` | Supported |
| `parallel` | `jobs.<job_id>` | Supported |
| `pipelines.branches` | `on.push` | Supported |
| `pipelines.custom` | `on.workflow_dispatch` | Supported |
| `pipelines.default` | `on.push` | Supported |
| `pipelines.pull-requests` | `on.pull_request` | Supported |
| `pipelines.tags` | `on.push.tags` | Supported |
| `runs-on` | `jobs.<job_id>.runs-on` | Supported |
| `script` | `jobs.<job_id>.steps[*].run` | Supported |
| `services` | `jobs.<job_id>.services` | Supported |
| `stage` | `jobs.<job_id>` | Supported |
| `step` | `jobs.<job_id>.steps[*]` | Supported |
| `trigger` | `on.workflow_dispatch` | Supported |
| `fail-fast` | None | Unsupported |
| `oidc` | None | Unsupported |
| `options.size` | None | Unsupported |
| `size` | None | Unsupported |
{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Bitbucket environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.

| Bitbucket | GitHub Actions |
| :--- | :--- |
| `CI` | `true` |
| `BITBUCKET_BUILD_NUMBER` | {% raw %}`${{ github.run_number }}`{% endraw %} |
| `BITBUCKET_CLONE_DIR` | {% raw %}`${{ github.workspace }}`{% endraw %} |
| `BITBUCKET_COMMIT` | {% raw %}`${{ github.sha }}`{% endraw %} |
| `BITBUCKET_WORKSPACE` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
| `BITBUCKET_REPO_SLUG` | {% raw %}`${{ github.repository }}`{% endraw %} |
| `BITBUCKET_REPO_UUID` | {% raw %}`${{ github.repository_id }}`{% endraw %} |
| `BITBUCKET_REPO_FULL_NAME` | {% raw %}`${{ github.repository_owner }}`{% endraw %}/{% raw %}`${{ github.repository }}`{% endraw %} |
| `BITBUCKET_BRANCH` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `BITBUCKET_TAG` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `BITBUCKET_PR_ID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
| `BITBUCKET_PR_DESTINATION_BRANCH` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
| `BITBUCKET_GIT_HTTP_ORIGIN` | {% raw %}`${{ github.event.repository.clone_url }}`{% endraw %} |
| `BITBUCKET_GIT_SSH_ORIGIN` | {% raw %}`${{ github.event.repository.ssh_url }}`{% endraw %} |
| `BITBUCKET_EXIT_CODE` | {% raw %}`${{ job.status }}`{% endraw %} |
| `BITBUCKET_STEP_UUID` | {% raw %}`${{ github.job }}`{% endraw %} |
| `BITBUCKET_PIPELINE_UUID` | {% raw %}`${{ github.workflow }}`{% endraw %} |
| `BITBUCKET_PROJECT_KEY` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
| `BITBUCKET_PROJECT_UUID` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
| `BITBUCKET_STEP_TRIGGERER_UUID` | {% raw %}`${{ github.actor_id }}`{% endraw %} |
| `BITBUCKET_SSH_KEY_FILE` | {% raw %}`${{ github.workspace }}/.ssh/id_rsa`{% endraw %} |
| `BITBUCKET_STEP_OIDC_TOKEN` | No mapping |
| `BITBUCKET_DEPLOYMENT_ENVIRONMENT` | No mapping |
| `BITBUCKET_DEPLOYMENT_ENVIRONMENT_UUID` | No mapping |
| `BITBUCKET_BOOKMARK` | No mapping |
| `BITBUCKET_PARALLEL_STEP` | No mapping |
| `BITBUCKET_PARALLEL_STEP_COUNT` | No mapping |
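For example, a step that previously echoed `$BITBUCKET_COMMIT` would, per the table above, use the `github.sha` expression after conversion. The step below is a hypothetical illustration, not importer output:

```yaml
# Hypothetical converted step: BITBUCKET_COMMIT maps to github.sha.
- name: Print commit
  run: echo "Building commit {% raw %}${{ github.sha }}{% endraw %}"
```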
System variables used in tasks are transformed to the equivalent bash shell variable and are assumed to be available. For example, `${system.<variable.name>}` will be transformed to `$variable_name`. We recommend you verify this to ensure proper operation of the workflow.
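The renaming can be sketched as follows; `to_shell_var` is a hypothetical helper written for illustration, not part of {% data variables.product.prodname_actions_importer %}:

```shell
# Mimics the documented rewrite: "${system.<variable.name>}" -> "$variable_name".
to_shell_var() {
  ref=$1
  name=${ref#??system.}    # drop the leading "${system." (?? matches the "${")
  name=${name%?}           # drop the trailing "}"
  printf '$%s\n' "$(printf '%s' "$name" | tr '.' '_')"
}
to_shell_var '${system.variable.name}'   # prints: $variable_name
```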
{% data reusables.actions.actions-importer-legal-notice %}