Discover Resources on GCP
Prerequisites
Before proceeding with Cloud to Code CLI, you’ll need to meet the requirements listed below.
- Install Homebrew (macOS and Linux) and Terraform.
- Install cloud2code on your system. Check out the OS-specific installation guides.
- If you do not have the GCP CLI installed, refer to the GCP documentation.
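Before moving on, you can sanity-check the prerequisites from a terminal. A minimal sketch (the `brew`, `terraform`, `cloud2code`, and `gcloud` binary names are assumptions based on the list above):

```shell
# Check that each prerequisite tool is on PATH; this only reports,
# it does not install anything.
for tool in brew terraform cloud2code gcloud; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: not installed"
  fi
done
```

Any tool reported as "not installed" should be set up before continuing.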
Authenticate Your GCP CLI
1. Run the following command to authenticate your GCP CLI:
gcloud auth login
2. Enter your application credentials JSON path and output format.
3. Follow the on-screen instructions to finish authenticating.
Refer to the GCP documentation to learn more about CLI authentication.
Read Access to Regions
Ensure you have the necessary permissions to access your cloud regions and resources. If unsure, check with your cloud administrator or run the following command to verify your account details:
gcloud asset search-all-iam-policies \
--scope=projects/PROJECT_ID \
--query="policy:email@example.com" \
--format="table(resource, policy.bindings.role)"
Replace PROJECT_ID and email@example.com with your own values.
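If you run this check repeatedly, the placeholders can be pulled into shell variables. A sketch that only prints the assembled command so you can review it first (the variable values are examples; drop the `echo` to actually run it):

```shell
# Build the IAM policy search command from variables instead of
# editing it inline. Both values below are placeholders.
PROJECT_ID="my-project-id"
MEMBER_EMAIL="email@example.com"

# echo prints the command for review; remove it to execute gcloud.
echo gcloud asset search-all-iam-policies \
  --scope="projects/${PROJECT_ID}" \
  --query="policy:${MEMBER_EMAIL}" \
  --format="table(resource, policy.bindings.role)"
```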
Create a Terraform State File
Follow these steps to create and import a tfstate file:
- Use the command line to create a local Terraform folder and switch your directory to the local folder.
- Run the following command to create a Terraform state file for Storage buckets and Compute instances from a specified region. You will find the tfstate file in the directory you created in Step 1.
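Step 1 can look like the following; the directory name is an arbitrary example, not something cloud2code requires:

```shell
# Create a local working directory for the Terraform import and
# switch into it. The path below is an example.
TF_DIR="$HOME/cloud2code-terraform"
mkdir -p "$TF_DIR"
cd "$TF_DIR"
pwd
```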
Usage
macOS
cloud2code import gcp --region <region> --project-id <project-id> --credentials <credentials-file-path> --include <resource_types> --exclude <resource_types> --output-dir <output-dir>
Windows (Docker)
# GOOGLE_APPLICATION_CREDENTIALS is the path to your JSON credentials file.
docker run --platform linux/arm64 --rm `
  -e GOOGLE_APPLICATION_CREDENTIALS="***************" `
  -v C:\Users\abc\output:/output `
  ghcr.io/stackgenhq/cloud2code import gcp --region us-east1 --include google_storage_bucket --output-dir /output
Docker Flags Explained
Option | Description |
---|---|
--platform | Ensures that the ghcr.io/stackgenhq/cloud2code container runs using the ARM64 architecture, regardless of your host machine's architecture. |
-e | Sets environment variables. |
-v | Mounts a volume: C:\Users\username\output is the directory path on your local Windows machine, while /output is the directory path inside the Docker container. |
Linux
cloud2code import gcp --region <region> --project-id <project-id> --credentials <credentials-file-path> --include <resource_types> --exclude <resource_types> --output-dir <output-dir>
cloud2code Flags Explained
Flag | Description | Required |
---|---|---|
--region | The GCP region from which resources will be imported. | Yes |
--project-id | The GCP project ID from which to import resources. | Yes |
--credentials | The path to your local GCP credentials JSON file. | Yes |
Example:
cloud2code import gcp --region us-east1 --include google_storage_bucket --include google_compute_instance --output-dir "/Users/abc/Downloads/terraform"
The command above imports Storage buckets and Compute instances from the us-east1 region into Terraform configuration files (.tf).
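To avoid retyping long flag lists, the example can be wrapped in a small script. A hypothetical sketch that only prints the command it would run, using the values from the example above (remove the `echo` to execute it):

```shell
# Assemble the cloud2code import command from variables and print it
# for review before executing.
REGION="us-east1"
OUTPUT_DIR="/Users/abc/Downloads/terraform"
RESOURCES="--include google_storage_bucket --include google_compute_instance"

echo cloud2code import gcp --region "$REGION" $RESOURCES --output-dir "$OUTPUT_DIR"
```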