Using a cloud storage service to serve static files has become common practice. The most popular option is Amazon S3 (Simple Storage Service), but there are alternatives, one of which is Google Cloud Storage (GCS). This tutorial shows you how to get started with GCS, from the registration process to obtaining the key required to manage objects in your cloud storage space. Then, I’ll give you examples of how to upload a file to Google Cloud Storage in Node.js using the @google-cloud/storage library.

1. Register and create project

Open Google Developer Console and create a new project

Create New Project

2. Enable Google Cloud Storage JSON API library

Google Cloud has a bunch of API libraries. Before you can use a library, you must enable it; this way you can control which libraries are enabled for a project. To find an API library, click Library on the sidebar.

Google Cloud Library Menu

Then use the search feature to find the library.

Google Cloud Library Search

On the Google Cloud Storage JSON API page, click Enable.

Enable JSON API Library

3. Create a new credential

Not everyone can upload to your bucket; a user must be authenticated. For authentication, your app needs a credential. Click Credentials on the sidebar menu, then click Create credentials.

Google Cloud Create Credential

On the new page, create a new service account or select an existing one. Select the appropriate role and choose JSON as the key type.

Google Cloud Create Service Account Key

If successful, the private key will be saved to your computer.

4. Create a new bucket

All uploaded files must be stored in a bucket. That's why you need to create at least one bucket. Open Google Cloud Storage Console. Then, click Create Bucket and type the name of the bucket you want to create.

After you have done all the steps above, it's time to write some code. To use the @google-cloud/storage library, we first call the constructor, passing the project ID along with the path to the .json authentication key, in order to be authenticated by the service.

helpers/google-cloud-storage.js

  const GoogleCloudStorage = require('@google-cloud/storage');
  const path = require('path'); // Used by the copyFileToGCS helper below
  
  const GOOGLE_CLOUD_PROJECT_ID = 'gcs-demo-123456'; // Replace with your project ID
  const GOOGLE_CLOUD_KEYFILE = 'path-to-the-private-key'; // Replace with the path to the downloaded private key
  
  const storage = GoogleCloudStorage({
    projectId: GOOGLE_CLOUD_PROJECT_ID,
    keyFilename: GOOGLE_CLOUD_KEYFILE,
  });
  
  // Export the client so other modules (e.g. the upload middleware later) can reuse it.
  exports.storage = storage;
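Before going any further, you may want to verify that the credential actually works. A minimal sanity check, assuming the storage client above (the bucket name gcs-bucket-demo is just a placeholder), is to list your buckets, or to create the bucket from code instead of through the console:

  // Quick sanity check: list buckets to confirm the credential is valid.
  storage.getBuckets()
    .then(([buckets]) => {
      console.log('Buckets:', buckets.map(bucket => bucket.name));
    })
    .catch((err) => {
      console.error('Authentication or permission problem:', err.message);
    });
  
  // Alternatively, create a bucket programmatically instead of via the console.
  // storage.createBucket('gcs-bucket-demo')
  //   .then(([bucket]) => console.log(`Bucket ${bucket.name} created.`));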

In this tutorial, uploaded files will have public access. Files with public access can be accessed using the following URL format: https://storage.googleapis.com/${bucketName}/${fileName}. Therefore, it's better to create a helper function for generating the public URL.

helpers/google-cloud-storage.js

  /**
   * Get public URL of a file. The file must have public access
   * @param {string} bucketName
   * @param {string} fileName
   * @return {string}
   */
  exports.getPublicUrl = (bucketName, fileName) => `https://storage.googleapis.com/${bucketName}/${fileName}`;
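For instance, a quick usage sketch (the bucket and file names are placeholders):

  const { getPublicUrl } = require('./helpers/google-cloud-storage');
  
  // 'gcs-bucket-demo' and 'photo.jpg' are placeholders.
  console.log(getPublicUrl('gcs-bucket-demo', 'photo.jpg'));
  // => https://storage.googleapis.com/gcs-bucket-demo/photo.jpg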

I divide this tutorial on how to upload a file to Google Cloud Storage using Node.js into two cases:

  • Upload a file from a local path
  • Directly upload a file stream to GCS

In the first case, you already have the file you want to upload in your local directory, and we need to copy it into a bucket on GCS. Here is an example.

helpers/google-cloud-storage.js

  /**
   * Copy file from local to a GCS bucket.
   * Uploaded file will be made publicly accessible.
   *
   * @param {string} localFilePath
   * @param {string} bucketName
   * @param {Object} [options]
   * @return {Promise.<string>} - The public URL of the uploaded file.
   */
  exports.copyFileToGCS = (localFilePath, bucketName, options) => {
    options = options || {};
  
    const bucket = storage.bucket(bucketName);
    const fileName = path.basename(localFilePath);
    const file = bucket.file(fileName);
  
    return bucket.upload(localFilePath, options)
      .then(() => file.makePublic())
      .then(() => exports.getPublicUrl(bucketName, fileName));
  };

The first parameter is the path where the file is stored in your local directory. The second parameter is options; you can find the list of available options in the library's documentation, under the JSDoc of bucket.upload. If the upload succeeds, the function above returns the public link to the uploaded file.
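A minimal usage sketch, assuming the helper above (the local path and bucket name are placeholders):

  const gcsHelpers = require('./helpers/google-cloud-storage');
  
  // './photo.jpg' and 'gcs-bucket-demo' are placeholders; use your own path and bucket.
  gcsHelpers.copyFileToGCS('./photo.jpg', 'gcs-bucket-demo')
    .then((publicUrl) => console.log(`File available at ${publicUrl}`))
    .catch((err) => console.error('Upload failed:', err));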

The code above makes the uploaded file publicly accessible because of the file.makePublic invocation; by default the file will be private. The options parameter, which is an object, supports a predefinedAcl property with the following values:

  • authenticatedRead: Object owner gets `OWNER` access, and `allAuthenticatedUsers` get `READER` access.
  • bucketOwnerFullControl: Object owner gets `OWNER` access, and project team owners get `OWNER` access.
  • bucketOwnerRead: Object owner gets `OWNER` access, and project team owners get `READER` access.
  • private: Object owner gets `OWNER` access.
  • projectPrivate: Object owner gets `OWNER` access, and project team members get access according to their roles.
  • publicRead: Object owner gets `OWNER` access, and `allUsers` get `READER` access.

In addition, it also supports options.private and options.public, which are shorthands for options.predefinedAcl = 'private' and options.predefinedAcl = 'publicRead' respectively. However, using a predefined ACL (including the shorthands) may make the file's permissions impossible to change from the Google Cloud Storage bucket explorer. You can still change them later using file.makePublic or file.makePrivate, as in the sketch below.
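For example, a sketch of uploading with a predefined ACL, assuming the storage client exported from the helper above (the file and bucket names are placeholders):

  const { storage } = require('./helpers/google-cloud-storage');
  
  // Upload with a predefined ACL instead of calling file.makePublic afterwards.
  // './report.pdf' and 'gcs-bucket-demo' are placeholders.
  storage.bucket('gcs-bucket-demo')
    .upload('./report.pdf', { predefinedAcl: 'projectPrivate' })
    .then(([file]) => console.log(`${file.name} uploaded as project-private.`));
  
  // The permission can still be flipped later:
  // storage.bucket('gcs-bucket-demo').file('report.pdf').makePublic();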

For the second case, uploading a file stream directly to Google Cloud Storage, we need to set up a new middleware to handle the upload.

middlewares/google-cloud-storage.js

  const gcsHelpers = require('../helpers/google-cloud-storage');
  
  const { storage } = gcsHelpers;

  const DEFAULT_BUCKET_NAME = 'gcs-bucket-demo'; // Replace with the name of your bucket
  
  /**
   * Middleware for uploading file to GCS.
   * @param {Object} req
   * @param {Object} res
   * @param {Function} next
   * @return {*}
   */
  exports.sendUploadToGCS = (req, res, next) => {
    if (!req.file) {
      return next();
    }
  
    const bucketName = req.body.bucketName || DEFAULT_BUCKET_NAME;
    const bucket = storage.bucket(bucketName);
    const gcsFileName = `${Date.now()}-${req.file.originalname}`;
    const file = bucket.file(gcsFileName);
  
    const stream = file.createWriteStream({
      metadata: {
        contentType: req.file.mimetype,
      },
    });
  
    stream.on('error', (err) => {
      req.file.cloudStorageError = err;
      next(err);
    });
  
    stream.on('finish', () => {
      req.file.cloudStorageObject = gcsFileName;
  
      return file.makePublic()
        .then(() => {
          req.file.gcsUrl = gcsHelpers.getPublicUrl(bucketName, gcsFileName);
          next();
        });
    });
  
    stream.end(req.file.buffer);
  };

The code above will make the file publicly accessible as well, because we intentionally make it public using file.makePublic.

We also need to create a new route to handle file uploads to GCS. In this example, the route uses multer as the middleware for handling the file upload, then calls the sendUploadToGCS middleware we created before.

routes/index.js

  const express = require('express');
  const Multer = require('multer');
  
  const gcsMiddlewares = require('../middlewares/google-cloud-storage');
  
  const router = express.Router();
  
  const multer = Multer({
    storage: Multer.memoryStorage(), // Keep the file in memory so we can stream it to GCS
    limits: {
      fileSize: 10 * 1024 * 1024, // Maximum file size is 10MB
    },
  });
  
  router.post(
    '/upload',
    multer.single('image'),
    gcsMiddlewares.sendUploadToGCS,
    (req, res, next) => {
      if (req.file && req.file.gcsUrl) {
        return res.send(req.file.gcsUrl);
      }
  
      return res.status(500).send('Unable to upload');
    },
  );
  
  module.exports = router;

Once you’ve restarted your server, you can try calling the API. One of the easiest methods is to use an HTTP client such as Postman, which provides a file selector. If the file is uploaded successfully, you should get the link to the uploaded file in the response.