Google Cloud Platform (GCP) is a cloud computing service from Google. It runs on the same infrastructure that Google uses internally for its end-user products, such as YouTube, Google Search, and Gmail.
You can do almost any data-related work in GCP, from data engineering and data analytics to artificial intelligence, while investing very little time in setup.
This post is part 1 of my Google Cloud Platform tutorial series. In this post, I will give you a basic understanding of the Google Cloud Platform. In my next post, I will show you the end-to-end process of implementing a machine learning project in GCP.
How to use Google Cloud Platform for free
GCP currently offers $300 of free credit for a 3-month free trial, which you can use to learn GCP. You won't be charged at the end of your trial; instead, you will be notified, and your services will stop running until you decide to upgrade your plan. If you want to be extra safe, you can block the credit card you used and order a new one after using the free trial.
Why would you migrate to GCP?
There are lots of benefits to Google Cloud Platform; some of them are:
- No need to spend a lot of money on hardware
- Ability to scale on-demand, pay only for the resources you consume
- You can manage your APIs securely
- Data analytics and machine learning services are also available in GCP
Components of Google Cloud Platform
At this point, I hope you have some idea about Google Cloud Platform (GCP). Now I will walk you through some important components of GCP that I came across while working on machine learning in Google Cloud Platform. In this post I will give you an overview of these components; in my next post, I will show you how they are useful in day-to-day project work and how to use them.
1. Cloud Shell
Google Cloud Shell is an online command-line shell that automatically connects to your Google Cloud Platform account. Cloud Shell is used to configure and manage projects, and its behavior and syntax are similar to a Linux shell.
To launch Cloud Shell, click the Activate Cloud Shell button at the top right of the console.
Once it opens, you can use Cloud Shell like a Linux shell. You can use it to:
- Explore the files and folders of any project
- Authenticate a storage location, or authenticate gcloud using a service account
- Write and test Python code
You can also change the theme, font, color, etc. of Cloud Shell by clicking the Settings icon -> Terminal Preferences.
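As a quick sketch, here are a few commands you might run in Cloud Shell once it opens (the project ID below is a hypothetical placeholder, not from this tutorial):

```shell
# Show which account and project the shell is currently configured for
gcloud config list

# Set the active project (replace my-demo-project with your own project ID)
gcloud config set project my-demo-project

# Explore files and folders in your Cloud Shell home directory
ls -la ~

# Start Python interactively to write and test code
python3
```

These all require a signed-in GCP account, which Cloud Shell provides automatically.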
2. Cloud Storage
There are a lot of options for storing data in Google Cloud Platform, covering structured data, unstructured data, real-time data, multi-regional data, etc.
3. Cloud IAM & Admin
IAM stands for Identity and Access Management. Cloud IAM in Google Cloud Platform lets you manage access control by defining who (identity) has what role or permission on which GCP resources.
So what roles or permissions exist for GCP resources? There are three main roles you can grant to a user: Owner, Editor, and Viewer. You can also create custom roles as per user requirements.
There are several ways to grant a role to a user, such as a Service Account, a Google Account, a Google Group, etc., but in practice the Service Account is used most often. With a Service Account you can assign permissions to resources, where a resource can be storage (files and folders), an API, etc.
Let's consider an example to understand Service Accounts. Say component 1, a virtual machine in project A, needs access to storage or data in a different project (project B). In this case, a Service Account lets component 1 access that data.
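Granting such a role can also be done from the command line. Here is a hedged sketch, where the project IDs and the service-account e-mail are hypothetical placeholders:

```shell
# Grant the Viewer role on project-b to a service account that lives in project-a,
# so a VM running as that service account can read project-b's resources
gcloud projects add-iam-policy-binding project-b \
    --member="serviceAccount:component-1@project-a.iam.gserviceaccount.com" \
    --role="roles/viewer"
```

You can do the same thing from the IAM & Admin page in the console if you prefer the UI.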
To download a Service Account key for a project:
- Click the left navigation menu icon -> IAM & Admin -> Service Accounts
- Click on a particular Service Account
- Click the Keys tab at the top
- Click ADD KEY -> Create new key
- Select the JSON key type and click the CREATE button
A JSON file will be downloaded; that is the Service Account key. Save that key so you can use it multiple times.
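Once you have the key file, most Google client libraries can pick it up automatically through an environment variable. A minimal sketch, assuming a hypothetical key filename:

```python
import os

# Path to the downloaded service-account key (hypothetical filename)
key_path = "my-project-a1b2c3.json"

# Google client libraries look for credentials in this environment variable
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = key_path

print(os.environ["GOOGLE_APPLICATION_CREDENTIALS"])

# With the variable set, a client library call such as the following
# (requires the google-cloud-storage package and a real key file)
# would authenticate as the service account:
#
#   from google.cloud import storage
#   client = storage.Client(project="project-b")
#   for blob in client.list_blobs("shared-bucket"):
#       print(blob.name)
```

The bucket and project names in the commented part are placeholders for illustration only.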
4. Dataproc
Dataproc is a managed service for open-source (OSS) big data processing jobs, including ETL and machine learning. It provides support for the most popular open-source software, and you can use it to move workloads from on-premises clusters to the cloud.
5. Cloud Composer
Google Cloud Composer automates workflows. Say you have a project that needs to collect, move, transform, and check data, and all of these tasks must finish at the same time every day or month. This is where Google Cloud Composer comes in: it is a fully managed version of Apache Airflow and lets you schedule and orchestrate this kind of task.
Understanding billing of GCP
Now the important question is: what will Google Cloud Platform cost for your project? It varies from project to project depending on your requirements and configuration. You can visit the GCP pricing calculator to estimate your cost; once you enter a particular configuration, you will see the estimated cost.
For example, you can specify a certain instance type in a certain region along with 100 gigabytes of egress traffic per month to the Americas and EMEA, and the pricing calculator will show you the total estimated cost for that.
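The arithmetic behind such an estimate is simple to sketch. The per-unit rates below are made-up placeholders for illustration, not real GCP prices; always use the official pricing calculator for actual numbers:

```python
# Hypothetical inputs: one VM running all month plus 100 GB of egress
vm_hours_per_month = 730       # roughly 24 hours x ~30.4 days
vm_rate_per_hour = 0.05        # placeholder $/hour for the instance type
egress_gb = 100                # egress to the Americas/EMEA per month
egress_rate_per_gb = 0.12      # placeholder $/GB

# Cost = usage x unit rate, summed across resources
vm_cost = vm_hours_per_month * vm_rate_per_hour
egress_cost = egress_gb * egress_rate_per_gb
total = vm_cost + egress_cost

print(f"Estimated monthly cost: ${total:.2f}")  # -> $48.50 with these rates
```

The real calculator does the same kind of sum, just with current published rates and sustained-use or committed-use discounts applied.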
In this post, I gave you a basic overview of Google Cloud Platform. In my next few posts, I will show you how you can use the above-mentioned components of GCP in your day-to-day projects.
If you have any questions or suggestions regarding this topic, see you in the comments section. I will try my best to answer.
Hi there, I’m Anindya Naskar, Data Science Engineer. I created this website to show you what I believe is the best possible way to get your start in the field of Data Science.