JSON is Not Cool Anymore: Implementing Protocol Buffers in Node.js

Datetime: 2016-08-23 03:11:14          Topic: Node.js

There’s a better alternative to the ubiquitous JSON as the communication protocol of the web. It’s Protocol Buffers (protobuf). In a nutshell, protobuf offers a more dense format (faster processing) and provides data schemas (enforcement of structure and better compatibility with old code).
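To make the density claim concrete, here is a hand-rolled sketch of the protobuf wire format for the two-string message used later in this article. It assumes single-byte lengths (strings shorter than 128 bytes); real encoders emit varints, and `encodeMessage` is a hypothetical helper, not part of any library:

```javascript
// Minimal sketch of protobuf wire-format encoding for a two-string message.
// Each field is a tag byte ((fieldNumber << 3) | wireType) — wire type 2
// means length-delimited — followed by a length byte and the UTF-8 bytes.
function encodeMessage(text, lang) {
  const textBytes = Buffer.from(text, 'utf8')
  const langBytes = Buffer.from(lang, 'utf8')
  return Buffer.concat([
    Buffer.from([(1 << 3) | 2, textBytes.length]), textBytes, // field 1: text
    Buffer.from([(2 << 3) | 2, langBytes.length]), langBytes  // field 2: lang
  ])
}

const proto = encodeMessage('hey', 'english')
const json = Buffer.from(JSON.stringify({text: 'hey', lang: 'english'}))
console.log(proto.length, json.length) // 14 vs 31 bytes
```

The binary form skips field names, braces, and quoting entirely, which is where the size win over JSON comes from.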

Protocol Buffers were introduced by Google. You can read more about them at the official Protocol Buffers Developer Guide. For something shorter, read 5 Reasons to Use Protocol Buffers Instead of JSON For Your Next Service, which will give you a quick overview of the protobuf benefits over JSON.

The purpose of this article is not to highlight why protobufs are better or sell you on the concept. There are many articles online that do that. The purpose of this article is to show you how you can get started with this format in the Node.js environment.

This article will walk you through a RESTful API implementation of protocol buffers with Node.js, Express.js, Axios and Protobuf.js. The code in this post runs on Node v6.2.0 because it’s written in the cutting-edge ES6/ES2015 version of the JavaScript language. We’ll have a message consisting of two fields, text and lang, sent as protobuf from the server, decoded and shown in the browser. We’ll also have a button which sends another protobuf message to the server. The source code’s in the GitHub repository azat-co/proto-buffer-api .

[Sidenote] : If you liked this post and want to learn more about Node.js, MongoDB, Express.js and React.js; if you are an experienced programmer who wants to learn MEAN or MERN quickly in a convenient format (not spending hours and hours searching for free blog posts on the internet—you get what you pay for, right?), then take a look at this offer.

I'm giving access to ALL of my online courses (5), ALL of my ebooks (5) and ALL cheatsheets (4) in one convenient BUNDLE . But that's not all! Often, courses and books are not enough. Often you need to ask an expert. That's why I included a 1-hour mentorship with me in the bundle!

Start mastering React.js, Node.js, MongoDB/Mongoose and Express.js. Upgrade your career with my mentorship. Become familiar with full stack JavaScript+Node.js stack. Acquire new skills and confidence in the future by getting the bundle . [End of sidenote.]

This will be the structure of our project:

/proto-buffer-api
  /public
    axios.min.js
    bytebuffer.js
    index.html
    long.js
    message.proto
    protobuf.js
  /node_modules
  index.js
  package.json

The public folder is where all of our browser assets will reside. We have Axios to make HTTP requests from the browser to the server. It’s similar to Superagent or Request. You can also use jQuery to make HTTP requests. If you’re going to use a library other than Axios, just make sure you are submitting data as ArrayBuffer and sending it as application/octet-stream .

Protobuf.js is the library for working with Google’s Protocol Buffers in JavaScript and Node.js, so we’ll need the protobuf.js file in the browser. It requires support for long numbers (integers in JavaScript are limited to 53 bits of precision, as you know), and there’s a neat library that lets us work with 64-bit integers called long.js .
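The 53-bit limit is easy to demonstrate: above Number.MAX_SAFE_INTEGER, adjacent integers become indistinguishable, which is why a dedicated 64-bit type is needed:

```javascript
// JavaScript numbers are IEEE 754 doubles: integers are exact only up to 2^53.
console.log(Number.MAX_SAFE_INTEGER)                 // 9007199254740991 (2^53 - 1)
console.log(Math.pow(2, 53) === Math.pow(2, 53) + 1) // true — precision is lost
```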

The message.proto is the prototype (schema) of the message object which we’ll send from server to browser and back. It looks like this:

message Message {
    required string text = 1;
    required string lang = 2;
}

Protobuf.js requires one more dependency— bytebuffer.js for the ArrayBuffer data type.

The format is relatively easy to understand. We have two fields, text and lang . They’re both required. The numbers next to the field names are field tags, which protocol buffers use for encoding/decoding.

The index.html has minimal HTML which contains the library includes, a <pre> container where we’ll insert the response from the server, the button which triggers sendMessage() (we’ll write it later), and the <script> tag with requests and protobuf code.

Sidenote: If you like this post and are interested in corporate on-site JavaScript, Node.js and React.js training to boost the productivity of your team, then contact NodeProgram.com .

<html>
  <head>
    <script src="long.js"></script>
    <script src="bytebuffer.js"></script>
    <script src="protobuf.js"></script>
    <script src="axios.min.js"></script>
  </head>
  <body>
    <pre id="content"></pre>
    <button onClick="sendMessage()">send message to server</button>
    <script type="text/javascript">
        // Our requests and Protobuf code
    </script>
  </body>
</html>

Let’s dive into the browser JavaScript and implement two requests: a GET request to fetch a message from the server and a POST request to send a message to the server. Both will have to work with protocol buffers.

First of all, we create Message from our prototype file message.proto . In the callback of loadProtoFile we can invoke loadMessage() to make the GET request to the server.

"use strict";
let ProtoBuf = dcodeIO.ProtoBuf
let Message = ProtoBuf
  .loadProtoFile('./message.proto', (err, builder)=>{
    Message = builder.build('Message')
    loadMessage()
  })

Axios takes the URL of the request as the first argument and the request options as the second. One of the options we must provide is responseType: 'arraybuffer' . This tells the HTTP agent to give us the appropriate data type back. Axios works with promises, so in the then callback we can get response , log it and decode it using Message.decode() :

let loadMessage = ()=> {
  axios.get('/api/messages', {responseType: 'arraybuffer'})
    .then(function (response) {
      console.log('Response from the server: ', response)
      let msg = Message.decode(response.data)
      console.log('Decoded message', msg)
      document.getElementById('content').innerText = JSON.stringify(msg, null, 2)
    })
    .catch(function (response) {
      console.log(response)
    })
}

The result of the GET request is shown in DevTools in the screenshot below. You can observe that the response is in application/octet-stream :

As for sending protocol buffers to the server, make sure to create the object with new Message(data) and then invoke msg.toArrayBuffer() . It’s a good idea to set the Content-Type header to application/octet-stream so the server knows the format of the incoming data:

let sendMessage = ()=>{
  let msg = new Message({text: 'yo', lang: 'slang'})
  axios.post('/api/messages', msg.toArrayBuffer(),
      { responseType: 'arraybuffer',
      headers: {'Content-Type': 'application/octet-stream'}}
    ).then(function (response) {
      console.log(response)
    })
    .catch(function (response) {
      console.log(response)
    })
}

The result of POST with the appropriate Content-Type and payload is shown in the screenshot below:

We have the front end done, but it won’t work without the server code, so let’s implement the Node/Express code next. First of all, you will want to create the package.json . Feel free to copy this file, which has the dependencies:

{
  "name": "proto-buffer-api",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Azat Mardan",
  "license": "MIT",
  "dependencies": {
    "express": "^4.13.4",
    "protobufjs": "^5.0.1"
  }
}

Once you have package.json , you can install the dependencies with npm i . It will install express for building the HTTP server and protobufjs for working with Protocol Buffers on the server.

Let’s implement the server code. In index.js , we import the dependencies, create the express object and apply the static middleware for the public folder:

let path = require('path')
let express = require('express')
let app = express()
let publicFolderName = 'public'
app.use(express.static(publicFolderName))

Next, we’ll use an in-memory store to simplify this project. In other words, the data for the GET request will be coming from an array:

let messages = [
  {text: 'hey', lang: 'english'},
  {text: 'isänme', lang: 'tatar'},
  {text: 'hej', lang: 'swedish'}
]

Typically, you would use body-parser for parsing JSON requests. In order to process the incoming protobuf correctly, we need to collect it as an array of buffers. Let’s implement our own custom middleware to parse protobufs and store them in req.raw (the decorator pattern). We need to create req.raw only when the header Content-Type is application/octet-stream and when there’s data ( data.length > 0 ):

app.use((req, res, next) => {
  if (!req.is('application/octet-stream')) return next()
  let data = [] // List of Buffer objects
  req.on('data', (chunk) => {
    data.push(chunk) // Append Buffer object
  })
  req.on('end', () => {
    if (data.length <= 0) return next()
    data = Buffer.concat(data) // Make one large Buffer of it
    console.log('Received buffer', data)
    req.raw = data
    next()
  })
})

Now we can create the builder object and “build” Message from our prototype file. We use the same prototype file public/message.proto as our front-end code:

let ProtoBuf = require('protobufjs')
let builder = ProtoBuf.loadProtoFile(
  path.join(__dirname,
  publicFolderName,
  'message.proto')
)
let Message = builder.build('Message')

Now we can implement the GET route, in which we create a new message, encode it and convert it to the Buffer type before sending it back to the client. Express’s response.send() takes care of adding the proper Content-Type header. You can use response.end() as well:

app.get('/api/messages', (req, res, next)=>{
  let msg = new Message(messages[Math.round(Math.random()*2)])
  console.log('Encode and decode: ',
    Message.decode(msg.encode().toBuffer()))
  console.log('Buffer we are sending: ', msg.encode().toBuffer())
  // res.end(msg.encode().toBuffer(), 'binary') // alternative
  res.send(msg.encode().toBuffer())
  // res.end(Buffer.from(msg.toArrayBuffer()), 'binary') // alternative
})

In the POST request handler, we decode from req.raw (it’s populated by the middleware we defined earlier) and log the message in the terminal:

app.post('/api/messages', (req, res, next)=>{
  if (req.raw) {
    try {
      // Decode the Message
      let msg = Message.decode(req.raw)
      console.log('Received "%s" in %s', msg.text, msg.lang)
      res.end() // acknowledge receipt so the client's promise resolves
    } catch (err) {
      console.log('Processing failed:', err)
      next(err)
    }
  } else {
    console.log('Not binary data')
    res.status(400).end()
  }
})

app.all('*', (req, res)=>{
  res.status(400).send('Not supported')
})

app.listen(3000)

If you typed all the code as I have, or copied it from my GitHub repository azat-co/proto-buffer-api , you should see a random message from the server on the web page. Then, if you click the button, you should see “yo” in the terminal/command prompt where your Node.js server is running.

That’s it. We implemented GET and POST to communicate in protocol buffers between Node.js/Express.js and browser JavaScript with Protobuf.js. We used Axios to make HTTP requests from the browser. It allowed us to work with promises and abstract some of the low-level XMLHttpRequest interface for working with binary data.

Google uses Protocol Buffers for its APIs. Protobufs are in many ways superior to JSON or XML, and with Protobuf.js and this quick tutorial you should be good to start using Protocol Buffers for your RESTful APIs!

To contact Azat, the main author of this blog, submit the contact form.

Also, make sure to get 3 amazing resources for FREE when you sign up for the newsletter.

Simple. Easy. No commitment.
