
LinkedIn Adopts Protocol Buffers for Microservices Integration and Reduces Latency by up to 60%
Jul 19, 2023 2 min read
LinkedIn adopted Protocol Buffers to exchange data between microservices more efficiently across its platform and integrated it with Rest.li, its open-source REST framework. After the company-wide rollout, latency dropped by up to 60% while resource utilization improved.
The LinkedIn platform employs a microservices architecture, and for years JSON has been the serialization format for the over 50 thousand API endpoints exposed by microservices at LinkedIn. To help its teams build consistent interactions between services, the company created a Java framework called Rest.li, which it later open-sourced.
The framework helps create servers and clients that communicate in the REST style and abstracts away many aspects of data exchange, including networking, serialization, and service discovery. It primarily supports Java and Python but can also work with Scala, Kotlin, JavaScript, Go, and other languages.

Data and Control Flow Between a Rest.li Server and Client (Source: Rest.li Documentation)
JSON is the default serialization format in Rest.li, selected for its wide language support and human readability. The latter property, however beneficial, creates problems from a performance (and particularly latency) point of view.
Karthik Ramgopal and Aman Gupta, engineers at LinkedIn, share challenges with using JSON for inter-service communication:
The first challenge is that JSON is a textual format, which tends to be verbose. This results in increased network bandwidth usage and higher latencies, which is less than ideal. [...] The second challenge we faced was that due to the textual nature of JSON, serialization and deserialization latency and throughput were suboptimal.
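The size gap between a textual and a binary encoding can be illustrated with a minimal sketch using only Python's standard library. A fixed-width binary layout stands in here for Protobuf's wire format (which additionally uses varints and field tags); the record and field names are invented for illustration:

```python
import json
import struct

# A small record, as a service might return it
record = {"memberId": 123456789, "connectionCount": 5000, "premium": True}

# JSON repeats every field name as text in every payload
json_payload = json.dumps(record).encode("utf-8")

# A binary encoding carries only the values (here: int64, int32, bool)
binary_payload = struct.pack(
    "<qi?", record["memberId"], record["connectionCount"], record["premium"]
)

print(len(json_payload), len(binary_payload))  # binary is several times smaller
```

Beyond size, parsing a binary layout avoids the character-by-character scanning that JSON deserialization requires, which is the throughput issue the engineers describe.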
The team considered alternatives to JSON, looking for a compact payload size and high serialization efficiency to reduce latency and increase throughput. They also wanted to avoid limiting the number of supported language stacks and to enable gradual migration by integrating the new serialization mechanism into Rest.li. After a comprehensive review, they chose Protocol Buffers (Protobuf), which scored highest against the defined criteria.
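For reference, a Protobuf schema defines messages with numbered fields; only the field numbers, not the names, appear on the wire. The definition below is illustrative, not one of LinkedIn's actual schemas:

```protobuf
syntax = "proto3";

message MemberProfile {
  int64 member_id = 1;        // field numbers, not names, go on the wire
  string first_name = 2;
  string last_name = 3;
  int32 connection_count = 4;
}
```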
The main difficulty in integrating Protocol Buffers into Rest.li was dynamic schema generation based on the framework's custom schema definition system, PDL. The solution involved generating a symbol table used to produce Protobuf schema definitions dynamically, but the method for delivering symbol tables varied by client type. Backend clients fetch and cache symbol tables on demand, while for web/mobile apps, symbol tables are generated at build time and included as versioned dependencies.
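The role of a symbol table can be sketched conceptually: it maps schema field names to compact integer identifiers so encoded payloads never carry repeated string keys. This is a simplified Python illustration, not LinkedIn's actual implementation:

```python
# Conceptual sketch: a symbol table assigns each field name a compact
# integer tag, so encoded payloads reference numbers instead of strings.
SYMBOL_TABLE = {"memberId": 1, "firstName": 2, "lastName": 3}
REVERSE_TABLE = {tag: name for name, tag in SYMBOL_TABLE.items()}

def encode(record: dict) -> dict:
    """Replace field names with their integer tags."""
    return {SYMBOL_TABLE[name]: value for name, value in record.items()}

def decode(payload: dict) -> dict:
    """Restore field names from integer tags."""
    return {REVERSE_TABLE[tag]: value for tag, value in payload.items()}
```

Both sides must agree on the same table, which is why delivery matters: backend clients can fetch and cache it at runtime, while build-time generation pins a version for web and mobile clients.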
After the framework changes were rolled out, the team gradually reconfigured clients to enable Protobuf instead of JSON via HTTP headers. Protocol Buffers adoption yielded an average throughput increase of 6.25% for responses and 1.77% for requests. The team also observed up to 60% latency reduction for large payloads.
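A header-driven switch of this kind amounts to content negotiation. The sketch below shows the general shape; the header name and media type are hypothetical, not Rest.li's actual protocol:

```python
def choose_serialization(headers: dict) -> str:
    """Pick a codec from the request's Accept header.

    Falls back to JSON so clients that have not yet been
    reconfigured keep working unchanged during a gradual rollout.
    """
    accept = headers.get("Accept", "")
    if "application/x-protobuf" in accept:
        return "protobuf"
    return "json"
```

Defaulting to JSON is what makes the migration gradual: enabling Protobuf is an opt-in, per-client configuration change rather than a coordinated cutover.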

Latency comparison between JSON and Protobuf (Source: LinkedIn Integrates Protocol Buffers With Rest.li for Improved Microservices Performance)
Based on the learnings from the Protocol Buffers rollout, the team plans to follow up by migrating from Rest.li to gRPC, which also uses Protocol Buffers but additionally supports streaming and has a large community behind it.
See also the InfoQ Podcast: API Showdown: REST vs. GraphQL vs. gRPC – Which Should You Use?
About the Author
Rafal Gancarz
Rafal is an experienced technology leader and expert. He's currently helping Starbucks make its Commerce Platform scalable, resilient and cost-effective. Previously, Rafal has been involved in designing and building large-scale, distributed and cloud-based systems for Cisco, Accenture, Capita, ICE, Callsign and others. His interests span architecture & design, continuous delivery, observability and operability, as well as sociotechnical and organisational aspects of software delivery.