You can migrate individual repositories, or all repositories, from a Bitbucket Server instance using the GitHub CLI.
At this time, migrating from Bitbucket Server with the API is not supported.
Note
If the destination organization has rulesets that the incoming repository doesn't match, the migration will be blocked. To bypass these rulesets and allow the migration, you can add a bypass for deploy keys to the relevant rulesets in the target organization.
Repository rulesets can be set at the organization level, so you will need to add the deploy key bypass to each ruleset that the incoming repository does not match. See Creating rulesets for repositories in your organization.
- We strongly recommend that you perform a trial run of your migration and complete your production migration soon after. To learn more about trial runs, see Overview of a migration from Bitbucket Server to GitHub Enterprise Cloud.
- Ensure you understand the data that will be migrated and the known support limitations of GitHub Enterprise Importer. For more information, see About migrations from Bitbucket Server to GitHub Enterprise Cloud.
- While not required, we recommend halting work during your production migration. GitHub Enterprise Importer doesn't support delta migrations, so any changes that happen during the migration will not be migrated. If you choose not to halt work during your production migration, you'll need to migrate these changes manually.
- For the destination organization on GitHub, you must be an organization owner or have the migrator role. For more information, see Managing access for a migration from Bitbucket Server.
- You need the username and password for a Bitbucket Server account with admin or super admin permissions.
If this is your first migration, you'll need to install the BBS2GH extension of the GitHub CLI. For more information about the GitHub CLI, see About GitHub CLI.
Alternatively, you can download a standalone binary from the releases page for the github/gh-bbs2gh repository. You can run this binary directly, without the `gh` prefix.
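For example, a command you would normally run through the extension can be run with the binary directly. The commands below are a sketch; the binary filename depends on the release you downloaded.

```shell
# With the GitHub CLI extension
gh bbs2gh migrate-repo --help

# With a standalone binary (illustrative filename; use the file you downloaded)
./bbs2gh migrate-repo --help
```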
1. Install the GitHub CLI. For installation instructions for the GitHub CLI, see the GitHub CLI repository.

   Note

   You need version 2.4.0 or newer of the GitHub CLI. You can check the version you have installed with the `gh --version` command.

2. Install the BBS2GH extension.

   ```shell
   gh extension install github/gh-bbs2gh
   ```
Any time you need help with the BBS2GH extension, you can use the `--help` flag with a command. For example, `gh bbs2gh --help` will list all the available commands, and `gh bbs2gh migrate-repo --help` will list all the options available for the `migrate-repo` command.
The BBS2GH extension of the GitHub CLI is updated weekly. To make sure you're using the latest version, update the extension.

```shell
gh extension upgrade github/gh-bbs2gh
```
Before you can use the BBS2GH extension to migrate to GitHub Enterprise Cloud, you must create a personal access token that can access the destination organization, then set the personal access token as an environment variable.
You'll also need to set environment variables for your Bitbucket Server username and password and, if your Bitbucket Server instance runs on Windows, your SMB password.
Create and record a personal access token (classic) that will authenticate for the destination organization on GitHub Enterprise Cloud, making sure that the token meets all requirements. For more information, see Managing access for a migration from Bitbucket Server.
Set environment variables, replacing TOKEN with the personal access token you recorded above, USERNAME with the username of a Bitbucket Server account that has admin or super admin permissions, and PASSWORD with the password for the Bitbucket Server account.
If you're using Terminal, use the `export` command.

```shell
export GH_PAT="TOKEN"
export BBS_USERNAME="USERNAME"
export BBS_PASSWORD="PASSWORD"
# If your Bitbucket Server instance runs on Windows
export SMB_PASSWORD="PASSWORD"
```
If you're using PowerShell, use the `$env` command.

```shell
$env:GH_PAT="TOKEN"
$env:BBS_USERNAME="USERNAME"
$env:BBS_PASSWORD="PASSWORD"
# If your Bitbucket Server instance runs on Windows
$env:SMB_PASSWORD="PASSWORD"
```
If you're migrating to GitHub Enterprise Cloud with data residency, for convenience, set an environment variable for the base API URL for your enterprise.
If you're using Terminal, use the `export` command.

```shell
export TARGET_API_URL="https://api.octocorp.ghe.com"
```
If you're using PowerShell, use the `$env` command.

```shell
$env:TARGET_API_URL="https://api.octocorp.ghe.com"
```
You'll use this variable with the `--target-api-url` option in commands you run with the GitHub CLI.
Because many Bitbucket Server instances sit behind firewalls, the GitHub CLI uses blob storage as an internet-reachable intermediate location for your data.
You will first generate an archive of the data you want to migrate, then push that data to blob storage from behind your firewall.
The GitHub CLI supports the following blob storage providers:
- Amazon Web Services (AWS) S3
- Azure Blob Storage
Before you can run a migration, you need to set up a storage container with your chosen cloud provider to store your data.
In AWS, set up an S3 bucket. For more information, see Creating a bucket in the AWS documentation.
You will also need an AWS access key and secret key with the following permissions:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucketMultipartUploads",
        "s3:AbortMultipartUpload",
        "s3:ListBucket",
        "s3:DeleteObject",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": [
        "arn:aws:s3:::github-migration-bucket",
        "arn:aws:s3:::github-migration-bucket/*"
      ]
    }
  ]
}
```
Note
GitHub Enterprise Importer does not delete your archive from AWS after your migration is finished. To reduce storage costs, we recommend configuring auto-deletion of your archive after a period of time. For more information, see Setting lifecycle configuration on a bucket in the AWS documentation.
When you're ready to run your migration, you will need to provide your AWS credentials to the GitHub CLI: region, access key, secret key, and session token (if required). You can pass them as arguments, or set environment variables called `AWS_REGION`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_SESSION_TOKEN`.
You will also need to pass in the name of the S3 bucket using the `--aws-bucket-name` argument.
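For example, if you choose environment variables, you might set the AWS credentials in Terminal as follows. The values are placeholders; use your own region and keys.

```shell
# AWS credentials used to upload the migration archive to S3
export AWS_REGION="us-east-1"
export AWS_ACCESS_KEY_ID="YOUR-ACCESS-KEY-ID"
export AWS_SECRET_ACCESS_KEY="YOUR-SECRET-ACCESS-KEY"
# Only needed if you're using temporary credentials
export AWS_SESSION_TOKEN="YOUR-SESSION-TOKEN"
```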
In Azure, create a storage account and make a note of your connection string. For more information, see Manage storage account access keys in Microsoft Docs.
Note
GitHub Enterprise Importer does not delete your archive from Azure Blob Storage after your migration is finished. To reduce storage costs, we recommend configuring auto-deletion of your archive after a period of time. For more information, see Optimize costs by automatically managing the data lifecycle in Microsoft Docs.
When you're ready to run your migration, you can either pass your connection string into the GitHub CLI as an argument, or pass it in using an environment variable called `AZURE_STORAGE_CONNECTION_STRING`.
If you have configured firewall rules on your storage account, ensure you have allowed access to the IP ranges for your migration destination. See Managing access for a migration from Bitbucket Server.
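For example, if you choose the environment variable, you might set it in Terminal as follows. The connection string below is a placeholder; use the connection string for your own storage account.

```shell
# Connection string for the Azure storage account that will hold the migration archive
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=YOUR-ACCOUNT;AccountKey=YOUR-KEY;EndpointSuffix=core.windows.net"
```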
You can migrate repositories with the `gh bbs2gh migrate-repo` command.
When you migrate a repository, by default, the BBS2GH extension of the CLI performs the following steps:
- Connects to your Bitbucket Server instance and generates a migration archive per repository
- Downloads the migration archive from the Bitbucket Server instance to the machine where you're running the BBS2GH extension of the CLI, using SFTP (Linux) or SMB (Windows)
- Uploads the migration archives to the blob storage provider of your choice
- Starts your migration in GitHub Enterprise Cloud, using the URLs of the archives stored with your blob storage provider
- Deletes the migration archive from your local machine. (You'll need to delete the archive from your blob storage provider manually once the migration has finished.)
Alternatively, you can use the CLI to generate the archive, download that archive manually, and then use the CLI to continue the migration.
To migrate a single repository, use the `gh bbs2gh migrate-repo` command.
You must follow this step from a computer that can access:
- Your Bitbucket Server instance via HTTPS
- Your Bitbucket Server instance via SFTP, if your Bitbucket Server instance runs on Linux. In general, if you can access the server via SSH, then you can also use SFTP.
- Your Bitbucket Server instance via SMB, if your Bitbucket Server instance runs on Windows
- Your chosen blob storage provider
```shell
gh bbs2gh migrate-repo --bbs-server-url BBS-SERVER-URL \
  --bbs-project PROJECT --bbs-repo CURRENT-NAME \
  --github-org DESTINATION --github-repo NEW-NAME \
  # If you are migrating to GHE.com:
  --target-api-url TARGET-API-URL
  # If your Bitbucket Server instance runs on Linux:
  --ssh-user SSH-USER --ssh-private-key PATH-TO-KEY
  # If your Bitbucket Server instance runs on Windows:
  --smb-user SMB-USER
  # If you're using AWS S3 as your blob storage provider:
  --aws-bucket-name AWS-BUCKET-NAME
  # If you are running a Bitbucket Data Center cluster or your Bitbucket Server is behind a load balancer:
  --archive-download-host ARCHIVE-DOWNLOAD-HOST
```
Replace the placeholders in the command above with the following values.
Placeholder | Value |
---|---|
BBS-SERVER-URL | The URL for your Bitbucket Server instance |
PROJECT | The key for the Bitbucket Server project of the repository you want to migrate |
CURRENT-NAME | The name of the repository you want to migrate |
DESTINATION | Name of the destination organization |
NEW-NAME | The name you want the migrated repository to have |
TARGET-API-URL | The base API URL for your enterprise's subdomain of GHE.com. For example: https://api.octocorp.ghe.com . |
SSH-USER | If your Bitbucket Server instance runs on Linux, the username to use when connecting to your Bitbucket Server via SFTP |
PATH-TO-KEY | If your Bitbucket Server instance runs on Linux, the path to your SSH private key, such as ~/.ssh/id_rsa . For SSH key requirements, see Managing access for a migration from Bitbucket Server. |
SMB-USER | If your Bitbucket Server instance runs on Windows, the username to use when connecting to your Bitbucket Server via SMB |
AWS-BUCKET-NAME | The bucket name for your AWS S3 bucket |
ARCHIVE-DOWNLOAD-HOST | The host to use to connect to the Bitbucket Server/Data Center instance via SSH or SMB. You only need to specify this if you are running a Bitbucket Data Center cluster or your Bitbucket Server is behind a load balancer. |
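For example, a filled-in command might look like the following. The server URL, project key, repository names, organization, SSH details, and bucket name are all illustrative values, assuming a Linux-hosted instance and AWS S3 as the blob storage provider.

```shell
gh bbs2gh migrate-repo --bbs-server-url https://bitbucket.example.com \
  --bbs-project OS --bbs-repo airports \
  --github-org octo-org --github-repo os-airports \
  --ssh-user bitbucket --ssh-private-key ~/.ssh/id_rsa \
  --aws-bucket-name github-migration-bucket
```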
Note
If you get an error mentioning `Renci.SshNet`, then the GitHub CLI is having issues making an SFTP connection to your server to download your migration archive. For information about how to troubleshoot these issues, see Troubleshooting your migration with Enterprise Importer.
By default, the BBS2GH extension of the CLI performs the entire migration, including downloading the migration archive from the Bitbucket Server instance using SFTP or SMB.
However, some customers prefer to download the migration archive manually, because their server does not offer SFTP access, for example. In that case, you can use the CLI to generate the archive, download that archive manually, and then use the CLI to continue the migration.
You must follow this step from a computer that can access:
- Your Bitbucket Server instance via HTTPS
- Your chosen blob storage provider
First, use the `gh bbs2gh migrate-repo` command with only the following arguments:
```shell
gh bbs2gh migrate-repo --bbs-server-url BBS-SERVER-URL \
  --bbs-project PROJECT \
  --bbs-repo CURRENT-NAME
```
Replace the placeholders in the command above with the following values.
Placeholder | Value |
---|---|
BBS-SERVER-URL | The URL for your Bitbucket Server instance |
PROJECT | The key for the Bitbucket Server project of the repository you want to migrate |
CURRENT-NAME | The name of the repository you want to migrate |
Your migration archive will be generated, and its path will be printed in the command output:
```text
[12:14] [INFO] Export completed. Your migration archive should be ready on your
instance at $BITBUCKET_SHARED_HOME/data/migration/export/Bitbucket_export_9.tar
```
In general, `$BITBUCKET_SHARED_HOME` will be set to `/var/atlassian/application-data/bitbucket/shared` on Linux and `C:\Atlassian\ApplicationData\Bitbucket\Shared` on Windows, but this may differ depending on your server configuration. To help you identify your shared home directory, see Troubleshooting your migration with Enterprise Importer.
Download the migration archive from your Bitbucket Server instance, and store the archive on the machine where you're running the CLI.
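For example, on a Linux-hosted instance you might copy the archive over SSH. This is a sketch: the user, hostname, and archive path are illustrative, and the path should match the location printed in the previous step.

```shell
# Copy the generated archive from the Bitbucket Server instance to the local machine
scp bitbucket@bitbucket.example.com:/var/atlassian/application-data/bitbucket/shared/data/migration/export/Bitbucket_export_9.tar .
```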
To import your migration archive into GitHub, use the `gh bbs2gh migrate-repo` command again, with a different set of arguments:
```shell
gh bbs2gh migrate-repo --archive-path ARCHIVE-PATH \
  --github-org DESTINATION --github-repo NEW-NAME \
  --bbs-server-url BBS-SERVER-URL \
  --bbs-project PROJECT \
  --bbs-repo CURRENT-NAME \
  # If you're using AWS S3 as your blob storage provider:
  --aws-bucket-name AWS-BUCKET-NAME
  # If you are migrating to GHE.com:
  --target-api-url TARGET-API-URL
```
Replace the placeholders in the command above with the following values.
Placeholder | Value |
---|---|
ARCHIVE-PATH | The path to the Bitbucket Server migration archive you downloaded from your instance |
DESTINATION | Name of the destination organization |
NEW-NAME | The name you want the migrated repository to have |
BBS-SERVER-URL | The URL for your Bitbucket Server instance |
PROJECT | The key for the Bitbucket Server project of the repository you want to migrate |
CURRENT-NAME | The name of the repository you want to migrate |
AWS-BUCKET-NAME | The bucket name for your AWS S3 bucket |
TARGET-API-URL | The base API URL for your enterprise's subdomain of GHE.com. For example: https://api.octocorp.ghe.com . |
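For example, continuing the illustrative values used earlier, the import command might look like this, assuming AWS S3 as the blob storage provider and an archive downloaded to the current directory.

```shell
gh bbs2gh migrate-repo --archive-path ./Bitbucket_export_9.tar \
  --github-org octo-org --github-repo os-airports \
  --bbs-server-url https://bitbucket.example.com \
  --bbs-project OS \
  --bbs-repo airports \
  --aws-bucket-name github-migration-bucket
```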
If you want to cancel a migration, use the `abort-migration` command, replacing MIGRATION-ID with the ID returned from `migrate-repo`.
```shell
gh bbs2gh abort-migration --migration-id MIGRATION-ID
```
When your migration is complete, we recommend reviewing your migration log. For more information, see Accessing your migration logs for Enterprise Importer.
We recommend that you review your migrated repositories for a soundness check.
If you want to migrate multiple repositories to GitHub Enterprise Cloud at once, use the GitHub CLI to generate a migration script. The resulting script will contain a list of migration commands, one per repository.
Note
Generating a script outputs a PowerShell script. If you're using Terminal, you will need to output the script with the `.ps1` file extension and install PowerShell for either Mac or Linux to run it.
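For example, you could install PowerShell with Homebrew on macOS or with Snap on Ubuntu. These commands are a sketch and assume the respective package manager is already installed.

```shell
# macOS (Homebrew)
brew install --cask powershell

# Ubuntu (Snap)
sudo snap install powershell --classic
```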
You must follow this step from a computer that can access your Bitbucket Server instance via HTTPS.
To generate a migration script, run the `gh bbs2gh generate-script` command.
```shell
gh bbs2gh generate-script --bbs-server-url BBS-SERVER-URL \
  --github-org DESTINATION \
  --output FILENAME \
  # If you are migrating to GHE.com:
  --target-api-url TARGET-API-URL
  # If your Bitbucket Server instance runs on Linux:
  --ssh-user SSH-USER --ssh-private-key PATH-TO-KEY
  # If your Bitbucket Server instance runs on Windows:
  --smb-user SMB-USER
  # If you are running a Bitbucket Data Center cluster or your Bitbucket Server is behind a load balancer:
  --archive-download-host ARCHIVE-DOWNLOAD-HOST
```
If you want the script to download the migration log for each migrated repository, add the `--download-migration-logs` flag. For more information about migration logs, see Accessing your migration logs for Enterprise Importer.
Replace the placeholders in the command above with the following values.
Placeholder | Value |
---|---|
BBS-SERVER-URL | The URL for your Bitbucket Server instance |
DESTINATION | Name of the destination organization |
FILENAME | A filename for the resulting migration script. If you're using Terminal, use a .ps1 file extension, as the generated script requires PowerShell to run. You can install PowerShell for Mac or Linux. |
TARGET-API-URL | The base API URL for your enterprise's subdomain of GHE.com. For example: https://api.octocorp.ghe.com . |
SSH-USER | If your Bitbucket Server instance runs on Linux, the username to use when connecting to your Bitbucket Server via SFTP |
PATH-TO-KEY | If your Bitbucket Server instance runs on Linux, the path to your SSH private key, such as ~/.ssh/id_rsa . For SSH key requirements, see Managing access for a migration from Bitbucket Server. |
SMB-USER | If your Bitbucket Server instance runs on Windows, the username to use when connecting to your Bitbucket Server via SMB |
ARCHIVE-DOWNLOAD-HOST | The host to use to connect to the Bitbucket Server/Data Center instance via SSH or SMB. You only need to specify this if you are running a Bitbucket Data Center cluster or your Bitbucket Server is behind a load balancer. |
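For example, a filled-in command might look like the following. The server URL, organization, output filename, and SSH details are illustrative values, assuming a Linux-hosted instance.

```shell
gh bbs2gh generate-script --bbs-server-url https://bitbucket.example.com \
  --github-org octo-org \
  --output migrate-repos.ps1 \
  --ssh-user bitbucket --ssh-private-key ~/.ssh/id_rsa
```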
After you generate the script, review the file and, optionally, edit the script.
- If there are any repositories you don't want to migrate, delete or comment out the corresponding lines.
- By default, repository names in GitHub will follow a `projectKey-repositoryName` convention. For example, a Bitbucket Server repository named `airports` that is part of the `open-source` project, which has the key `OS`, would be called `OS-airports` in GitHub. If you want any repositories to have a different name on GitHub, update the value for the corresponding `--github-repo` flag.
If you downloaded BBS2GH as a standalone binary rather than as an extension for the GitHub CLI, you will need to update your generated script to run the binary instead of `gh bbs2gh`.
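As an illustration only (the exact contents of your generated script will differ), a line in the script might be commented out to skip a repository, or have its `--github-repo` value changed to rename the destination repository:

```shell
# Skipped: this repository will not be migrated
# gh bbs2gh migrate-repo --bbs-server-url "https://bitbucket.example.com" --bbs-project "OS" --bbs-repo "airports" --github-org "octo-org" --github-repo "OS-airports"

# Renamed: --github-repo changed from the default "OS-runways" to "flight-runways"
gh bbs2gh migrate-repo --bbs-server-url "https://bitbucket.example.com" --bbs-project "OS" --bbs-repo "runways" --github-org "octo-org" --github-repo "flight-runways"
```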
To migrate your repositories, run the generated script.
You must follow this step from a computer that can access:
- Your Bitbucket Server instance via HTTPS
- Your Bitbucket Server instance via SFTP, if your Bitbucket Server instance runs on Linux. In general, if you can access the server via SSH, then you can also use SFTP.
- Your Bitbucket Server instance via SMB, if your Bitbucket Server instance runs on Windows
- Your chosen blob storage provider
Before running the script, you must set additional environment variables to authenticate to your blob storage provider.
- For AWS S3, set the following environment variables.
  - `AWS_ACCESS_KEY_ID`: The access key ID for your bucket
  - `AWS_SECRET_ACCESS_KEY`: The secret key for your bucket
  - `AWS_REGION`: The AWS region where your bucket is located
  - `AWS_SESSION_TOKEN`: The session token, if you're using AWS temporary credentials (see Using temporary credentials with AWS resources in the AWS documentation)
- For Azure Blob Storage, set `AZURE_STORAGE_CONNECTION_STRING` to the connection string for your Azure storage account.

  Only connection strings using storage account access keys are supported. Connection strings which use shared access signatures (SAS) are not supported. For more information about storage account access keys, see Manage storage account access keys in the Azure documentation.
To migrate multiple repositories, run the script you generated above. Replace FILENAME in the commands below with the filename you provided when generating the script.
If you're using Terminal, use `./`.

```shell
./FILENAME
```
If you're using PowerShell, use `.\`.

```shell
.\FILENAME
```
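If the script isn't directly executable in Terminal, you can mark it executable or invoke PowerShell explicitly. This is a sketch, assuming PowerShell is installed as `pwsh` and the script is named `migrate-repos.ps1`.

```shell
# Make the generated script executable, then run it
chmod +x ./migrate-repos.ps1
./migrate-repos.ps1

# Or invoke PowerShell directly
pwsh ./migrate-repos.ps1
```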