Securely Accessing Our Google Cloud SQL Instances
Our organization uses GCP’s Cloud SQL service to host our relational databases. A challenge we faced was ensuring that no unauthenticated traffic can reach the databases, while still letting developers who need database access easily view the tables they need.
On the compute side, our APIs running in GCP’s Cloud Run service also require access to these databases.
Furthermore, we wanted to implement the simplest, least expensive option available to us.
Initial Approach: VPC + VPN
We initially explored placing our DB within our VPC with only a private endpoint, creating an organization-wide VPN that can access the VPC, and then providing access to our Cloud Run instances via a Serverless VPC Access Connector.
We identified some VPN providers, but after building out the GCP resources to support this solution, we realized that although this route is sometimes viewed as “canonical”, it burdened us with some potentially avoidable costs:
- A VPN with a dedicated server IP costs ~$480 / year, plus $108 / developer, from a service like NordLayer.
- For a development team like ours, without a significant networking background, the cost of setting up and maintaining the service is not trivial.
- Least importantly, but still worth noting, a VPN is one more tool that new team members must be provisioned access to.
New Approach: Locked-down Public IP with Google Cloud Proxy
After some more research, we realized we could accomplish our goals less expensively, with no additional cloud infrastructure, and a self-serve onboarding process.
The solution we settled on was to provision a public endpoint for our database instances while allowing no external networks to access them. Developers with the requisite IAM permissions then get encrypted database access through Google’s Cloud SQL Proxy client.
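As a sketch of what “public endpoint, no authorized networks” looks like at the gcloud level, the helper below prints the command we would run (printing rather than executing, since the real command requires an authenticated GCP project; the instance name is a placeholder, not one of our actual resources):

```shell
# Build the gcloud command that keeps a Cloud SQL instance's public IP
# but clears every authorized external network, so only IAM-authenticated
# proxy traffic can connect. Instance name is a hypothetical placeholder.
lockdown_command() {
  local instance="$1"
  printf 'gcloud sql instances patch %s --assign-ip --clear-authorized-networks\n' "$instance"
}

lockdown_command "my-db-instance"
```

With no authorized networks, direct connections to the public IP are rejected; the only path in is the IAM-gated proxy described below.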
Now, our developers who need access to the databases can do the following:
- Run: gcloud auth login
- Run: gcloud auth application-default login
- Download the Cloud SQL Proxy Client
- Run: PATH/TO/YOUR/EXECUTABLE/cloud_sql_proxy.exe -instances=your_connection_name=tcp:3306
A few notes: chances are that developers have already run steps one and two previously. Also, the port depends on your database engine: 3306 for MySQL and 5432 for PostgreSQL.
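The steps above can be sketched as a small script. The connection name is a placeholder (the format is project:region:instance), and the helper prints the proxy invocation rather than running it, since that requires a real instance:

```shell
# Print the Cloud SQL Proxy (v1) invocation for a given connection name
# and engine, picking the port by engine as noted above.
proxy_command() {
  local conn="$1" engine="$2" port
  case "$engine" in
    mysql)    port=3306 ;;
    postgres) port=5432 ;;
    *) echo "unknown engine: $engine" >&2; return 1 ;;
  esac
  printf './cloud_sql_proxy -instances=%s=tcp:%s\n' "$conn" "$port"
}

# The full developer flow (connection name is hypothetical):
echo "gcloud auth login"
echo "gcloud auth application-default login"
proxy_command "my-project:us-central1:my-db" "mysql"
```

Once the proxy is running, any SQL client can connect to localhost on the chosen port; the proxy handles encryption and IAM authentication.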
JetBrains has an excellent guide for following these steps to connect to Cloud SQL via their DataGrip tool.
For our Cloud Run Instance, connecting to the Cloud SQL instance is as straightforward as adding the DB of interest to the “Cloud SQL Connections” line in the instance’s configuration. Google has good documentation explaining how this connection works for those curious.
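The console’s “Cloud SQL Connections” setting corresponds to a gcloud flag; the sketch below prints the equivalent command and the unix socket path the service then uses to reach the database (service and connection names are placeholders):

```shell
# Build the gcloud command that attaches a Cloud SQL instance to an
# existing Cloud Run service (names here are hypothetical).
attach_command() {
  printf 'gcloud run services update %s --add-cloudsql-instances=%s\n' "$1" "$2"
}

# Inside the container, the attached instance is reachable via a unix
# socket under /cloudsql/, keyed by the connection name.
socket_path() {
  printf '/cloudsql/%s\n' "$1"
}

attach_command "my-api" "my-project:us-central1:my-db"
socket_path "my-project:us-central1:my-db"
```

The application’s database client is then pointed at that socket path instead of a host and port.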
But isn’t this less secure than the VPC approach?
This is a question that should not be brushed aside, given that databases store business-critical information.
Comparing the two side by side, both solutions offer excellent security. With restricted IAM permissions, however, the Cloud SQL Proxy solution makes it easier to ensure that certain developers only have access to specific database instances, with no marginal cost or complexity per user.
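The per-developer grant behind this is a standard IAM binding. The helper below prints the command for granting the Cloud SQL Client role (project name and email are placeholders):

```shell
# Build the gcloud command that grants a developer the roles/cloudsql.client
# role, which the Cloud SQL Proxy requires to connect. Values are placeholders.
grant_command() {
  printf 'gcloud projects add-iam-policy-binding %s --member=user:%s --role=roles/cloudsql.client\n' "$1" "$2"
}

grant_command "my-project" "dev@example.com"
```

Revoking access is the mirror-image remove-iam-policy-binding command, which makes onboarding and offboarding self-serve.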
We hope this helps you decide on the simplest, least expensive, and secure method to access your Cloud SQL instances. Given our serverless GCP compute usage, a public endpoint combined with the Cloud SQL Proxy was the clear winner for our use case.
dragondrop.cloud’s mission is to automate developer best practices while working with Infrastructure as Code. Our flagship OSS product, cloud-concierge, allows developers to codify their cloud, detect drift, estimate cloud costs and security risks, and more — while delivering the results via a Pull Request. For enterprises running cloud-concierge at scale, we provide a management platform. To learn more, schedule a demo or get started today!