Cross-platform use of Google Protobuf

I recently got a chance to work with Google Protocol Buffers (commonly called Google Protobuf). It is a simple alternative for passing data over the network. As a proof of concept, we used Protobuf instead of JSON.

The first question that comes to mind: why Protobuf, if JSON is mature enough to handle everything? I will not go deep into that topic here, but this reference covers it well: https://auth0.com/blog/beating-json-performance-with-protobuf/

I am using a microservice architecture in the project, with services developed in different languages and connected through Apache Kafka.

This is the kind of architecture we are currently using in the project:

Service1, Service2, and Service3 act as Kafka producers here, while the parsing service acts as the Kafka consumer.

While working on this architecture I faced many issues, but the ones that were new to me, and poorly documented, were the Protobuf issues.

Using Protobuf in a project involves the following steps:

1) Create a .proto file.

2) Compile the .proto file, or use it directly in the project, depending on the language.

If you take the compiler option, you need to compile the .proto file using the protoc tool with the output option for your particular language.

A very important point to note here: every target language needs its own generated code, produced by running protoc with that language's output flag (for example, --python_out for Python).

3) After compilation, protoc creates a pb2 file (in the Python case), which we then need to use in our application.

4) Now you just import the .proto or pb2 file in the project and create message objects just like any other object in that language, for example with new. A minimal sketch follows below.
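
To make step 4 concrete, here is a minimal Python sketch. It assumes the Profile.proto file shown later in this article has already been compiled into a pb2 module importable as Profile_pb2; the profileId value is purely illustrative.

import Profile_pb2

profile = Profile_pb2.Profile()                   # create an empty message object
profile.profileId = 'user@example.com'            # hypothetical value, for illustration only
data = profile.SerializeToString()                # serialize to bytes for the wire
restored = Profile_pb2.Profile.FromString(data)   # parse the bytes back into an object
print(restored.profileId)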

For a more detailed description, check this link: https://developers.google.com/protocol-buffers/docs/tutorials

My Profile.proto file looks like this:

// profile.proto
syntax = "proto3";

message Profile {
   string profileId = 1;
   Git github = 2;
   Linkedin linkedin = 3;
}

message Starred {
   string title = 1;
   string description = 2;
   string content = 3;
}

message Comment {
   string data = 1;
}

message Issue {
   string title = 1;
   string description = 2;
   repeated Comment comments = 3;
}

message Repo {
   string title = 1;
   string description = 2;
   string content = 3;
   repeated Issue issues = 4;
}

message Git {
   int64 id = 1;
   string login = 2;
   string name = 3;
   string email = 4;
   string company = 5;
   string location = 6;
   string bio = 7;
   string blog = 8;
   string reposUrl = 9;
   repeated Repo repos = 10;
   repeated Starred starred = 11;
}

message Linkedin {
   string id = 1;
   string first_name = 2;
   string last_name = 3;
   string email_address = 4;
   string formatted_name = 5;
   string headline = 6;
   string industry = 7;
   string picture_url = 8;
   string summary = 9;
   string location_name = 10;
   string location_code = 11;
   string created_date = 12;
}
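
Since the Git message nests repeated Repo, Issue, and Comment messages, it helps to see how such nesting is populated. Here is a minimal Python sketch, assuming the file has been compiled for Python as shown next; all values are hypothetical.

import Profile_pb2

profile = Profile_pb2.Profile()
profile.profileId = 'user@example.com'   # hypothetical value
profile.github.id = 42                   # assigning a nested field marks 'github' as set
profile.github.login = 'octocat'
repo = profile.github.repos.add()        # repeated message fields grow via add()
repo.title = 'my-repo'
issue = repo.issues.add()
issue.title = 'first issue'
issue.comments.add(data='looks good')    # add() also accepts field values as kwargs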

I used this file as-is in Node.js, while for Python I compiled it as follows:

protoc -I=$SRC_DIR --python_out=$DST_DIR $SRC_DIR/profile.proto
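
This generates a Python module in the destination directory, named after the .proto file with a _pb2 suffix; that is the pb2 file referred to in step 3.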

Issues

1) How to use Protobuf in Node.js?

const protobuf = require('protobufjs');
protobuf.load("profile.proto", function (err, root) {
   if (err) {
      throw err;
   }

   const ProfileMessage = root.lookupType("Profile"); //look up the root message type
   const linkedinData = {
      profileId: emailId,
      linkedin: {
         id: userData.emailAddress ? userData.emailAddress : '',
         first_name: userData.firstName ? userData.firstName : '',
         last_name: userData.lastName ? userData.lastName : '',
         email_address: userData.emailAddress ? userData.emailAddress : '',
         formatted_name: userData.formattedName ? userData.formattedName : '',
         headline: userData.headline ? userData.headline : '',
         industry: userData.industry ? userData.industry : '',
         picture_url: userData.pictureUrl ? userData.pictureUrl : '',
         summary: userData.summary ? userData.summary : '',
         location_name: userData.location.name ? userData.location.name : '',
         location_code: userData.location.country.code ? userData.location.country.code : '',
         created_date: ''
      }
   };

   const errProfileMsg = ProfileMessage.verify(linkedinData); //verify the data against the protobuf structure
   if(errProfileMsg) {
      throw Error(errProfileMsg);
   }

   const message = ProfileMessage.create(linkedinData); //create final protobuf object
   const buffer = ProfileMessage.encode(message).finish(); //encode (serialize) the message to pass over Apache Kafka
});
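
Because both sides share the same .proto definition, a buffer produced by protobufjs in Node.js decodes cleanly with the Python-generated classes. A minimal sketch, assuming the same Profile_pb2 module used in the Python examples below:

import Profile_pb2

def decode_profile(raw_bytes):
    # Decode bytes produced by protobufjs in Node.js (or any other protobuf library)
    return Profile_pb2.Profile.FromString(raw_bytes)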

2) How to use the pb2 file in Python? I was using Python's Flask framework.

from api import Profile_pb2
from dict_to_protobuf import parse_dict

profile = Profile_pb2.Profile()  # create an empty object
item_dict = {'profileId': app.__getattribute__('emailId'), 'github': user.serialize()}  # prepare the data
parse_dict(item_dict, profile)  # push the data into the empty object
encoded = profile.SerializeToString()  # encode (serialize) the object to pass over Apache Kafka
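
To actually publish the encoded bytes, here is a minimal sketch using the kafka-python client; the topic name 'profiles' and the broker address are my assumptions, not part of the original setup.

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='localhost:9092')  # hypothetical broker address
producer.send('profiles', value=encoded)  # 'encoded' is profile.SerializeToString() from above
producer.flush()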

3) How to consume the data produced on Kafka and rebuild the original Profile object? The challenge here is that different services produce the data in different ways: Node.js uses the .proto file directly, while Python uses the compiled pb2 file.

import Profile_pb2
import json
from google.protobuf.json_format import MessageToJson

try:
    # Kafka and Elasticsearch connection code goes here
    for message in consumer:
        profile = Profile_pb2.Profile.FromString(message.value)
        # In proto3, message fields are never None; HasField reports whether they were set
        if profile.HasField('github'):
            jsonObj = MessageToJson(profile.github)
            obj = json.loads(jsonObj)  # converted into JSON because Elasticsearch accepts only JSON
            # Elasticsearch insertion code goes here
        if profile.HasField('linkedin'):
            jsonObj = MessageToJson(profile.linkedin)
            obj = json.loads(jsonObj)  # converted into JSON because Elasticsearch accepts only JSON
            # Elasticsearch insertion code goes here
except Exception as e:
    # Log the error appropriately.
    pass
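
For completeness, here is a minimal sketch of the Kafka consumer setup that the placeholder comment above alludes to, again using kafka-python; the topic name and broker address are assumptions.

from kafka import KafkaConsumer

consumer = KafkaConsumer('profiles',                          # hypothetical topic name
                         bootstrap_servers='localhost:9092',  # hypothetical broker address
                         auto_offset_reset='earliest')
# message.value in the loop above is then the raw protobuf bytes produced by any of the services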

Please feel free to ask any queries or questions. Please pardon me, as this is my first technical article, so I might have missed many things. You can connect with me at [email protected] or https://www.dhirubhai.net/in/pankaj-saboo/


