

Source: https://www.algolia.com/blog/engineering/bi-data-indexing-from-postgresql-to-algolia-1/

Part 1: Business Intelligence Data Indexing from PostgreSQL using Algolia – Use Case and Proposed Solution
Oct 10th 2022 engineering

We invited our friends at Starschema to write about an example of using Algolia in combination with PostgreSQL. Enjoy this two-part series by Software Developer Mátyás Budavári!
If you would like to skip ahead to part two, where the implementation is covered, click here.
Hello, I’m Mátyás, a software developer at Starschema. This post is about the audit logs of one of our products and how we can leverage the speed and distributed nature of Algolia to let our moderators search through the audit data without having access to the underlying PostgreSQL database.
Use Case
Our tool lets the users of our enterprise partners access all their reports from different Business Intelligence (BI) software and other resources in a centralized place. The most popular BI tools that we work with are Tableau and PowerBI. BI tools help communicate, present, and analyze data with powerful interactive visualizations that can work even with real-time data. Managing multiple data sources and dashboards is often a frustrating and time-consuming experience, filled with noise that makes it harder to derive insights and leads to flawed decisions.
We developed a service to improve the data analysis and presentation process by enabling individuals or teams to curate and collaborate with customized shared workspaces. Users can focus on analyzing data and communicating insights without having to shuffle between applications and authenticate multiple times. The contents of these reports are still served from the original tools, and we provide a unified experience on top of them. In addition, our users can modify the visible name of these reports and add additional tags and other metadata that make it easier for their teams to organize these reports.

We collect usage data periodically, which enables us to fine-tune the application based on how our users actually use it. In addition to collecting these metrics, we must comply with security requirements. We need to be able to tell who could access certain reports at a given time and what changes were made in the application by whom in a specific time range. We collect so-called audit logs to be able to extract this information. This data must be handled with care, and only people with higher privileges shall have access to the raw data.
In a recent project, we needed to develop an optional extension that extracts anonymized audit log information from our PostgreSQL database into an external place, so that a team of content managers can access that data quickly. These audit logs can contain JSON metadata, and it is hard to set up advanced search features over them with PostgreSQL alone. It’s better to index the data in a dedicated search service to speed up these queries.
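To make that concrete, here is a minimal sketch, in Go (the language our backend is written in), of what one anonymized audit log record could look like once it’s extracted for indexing. The field names are illustrative assumptions rather than our actual schema; the important part is that a few flat, searchable fields travel alongside the free-form JSON metadata.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// AuditRecord is an illustrative shape for one anonymized audit event.
// Algolia uses the "objectID" field as the unique record identifier.
type AuditRecord struct {
	ObjectID   string          `json:"objectID"`   // stable identifier of the log line
	ActorID    string          `json:"actorID"`    // anonymized user identifier
	ReportID   string          `json:"reportID"`   // anonymized report identifier
	Action     string          `json:"action"`     // e.g. "report.viewed", "report.renamed"
	OccurredAt time.Time       `json:"occurredAt"` // when the event happened
	Metadata   json.RawMessage `json:"metadata"`   // free-form JSON payload from the audit log
}

func main() {
	rec := AuditRecord{
		ObjectID:   "audit-42",
		ActorID:    "user-7f3a",
		ReportID:   "report-19",
		Action:     "report.viewed",
		OccurredAt: time.Now().UTC(),
		Metadata:   json.RawMessage(`{"source":"tableau","workspace":"sales"}`),
	}
	out, _ := json.Marshal(rec)
	fmt.Println(string(out))
}
```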
What we need
We need data about the usage distribution to present to the creators of the BI reports. Both Tableau and PowerBI provide usage metrics on their reports, but we’d like to show the usage information in a unified manner. We can show them detailed information on how their content is accessed through our application.
Since the report owners often want to know the peak periods of their insights, we need audit information to see what reports were accessed by whom through our application.
Note that the external BI sources manage permissions in their own specific ways. For this reason, the creators need to set up audience access in the tool they use to publish the reports; we synchronize this information, and only the BI reports a given user is allowed to access will show up for them.
How we can achieve it
We mainly use our application in internal and protected networks. Due to strict policies, it’s usually not possible to connect dedicated analytics services, so we save our analytics data into our own database. However, most of the time our analytics team does not have access to that database. We don’t send out actual user data or report metadata. Sometimes we can send out anonymized data and explain in a different medium what specific identifiers represent, if it’s relevant and we’re allowed to share.
All in all, we need a way to query the usage logs fast without access to the original database.
We decided to use Algolia as the search index. We don’t want to host, manage and maintain our search index service per installation, and it’s much better to have a managed centralized place where we can send our data. Our focus groups can use it to compose the reports they need based on the data sent over there.
Out of security and respect for our clients, it’s an opt-in feature.
Architecture
The backend service of our application is written in Go. We store the connection information and report metadata for the BI tools, along with the usage information, in a PostgreSQL database.
We aim for a solution as close to the database as possible. We want to leverage the language features that PostgreSQL provides without any extensions. And we don’t want to modify our code too much to support this data gathering, though we’re open to running small microservices that communicate with Algolia.
To send our data to Algolia, we need to run some code outside of PostgreSQL.
We want to create the data load component as a separate service with the single purpose of sending data to the index. Due to the privacy of our users and reports, we cannot expose our database to the above-mentioned external auditors. We can only send them a selected subset of the information.
Proposed solution
The audit log table can grow large over time in proportion to usage, and we don’t want to query the whole log table every time we need to send out data. It’s better to have a small queue that takes care of the data load. Every log line matters, but we can’t rely on these small services being available at all times, so we have to record which rows still need to be processed.
We plan to create a new table in PostgreSQL that stores the filtered data that is yet to be sent to Algolia. This table will act as a LIFO (last-in-first-out) queue, prioritizing new data over old.
We can send our already collected data into Algolia upon creating this new queue table.
We aim to attach a trigger to the source table so that, on every insert statement, a PostgreSQL procedure copies the necessary fields into this new table.
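As a rough sketch of how this part could look, the queue table and the trigger below are plain SQL, run from a small Go migration step since the backend is written in Go. All table, column, and function names (audit_log, audit_log_queue, enqueue_audit_log) are placeholders for illustration; the real schema and trigger definitions are covered in part two.

```go
package main

import (
	"database/sql"
	"log"
	"os"

	_ "github.com/lib/pq" // PostgreSQL driver
)

// Illustrative DDL: a queue table that mirrors the fields we want to index,
// plus a trigger that copies every newly inserted audit log row into it.
const setupSQL = `
CREATE TABLE IF NOT EXISTS audit_log_queue (
    id          BIGSERIAL PRIMARY KEY,
    audit_id    BIGINT      NOT NULL,
    actor_id    TEXT        NOT NULL,
    report_id   TEXT        NOT NULL,
    action      TEXT        NOT NULL,
    metadata    JSONB,
    occurred_at TIMESTAMPTZ NOT NULL
);

CREATE OR REPLACE FUNCTION enqueue_audit_log() RETURNS trigger AS $$
BEGIN
    INSERT INTO audit_log_queue (audit_id, actor_id, report_id, action, metadata, occurred_at)
    VALUES (NEW.id, NEW.actor_id, NEW.report_id, NEW.action, NEW.metadata, NEW.occurred_at);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- EXECUTE FUNCTION requires PostgreSQL 11+; older versions use EXECUTE PROCEDURE.
DROP TRIGGER IF EXISTS audit_log_enqueue ON audit_log;
CREATE TRIGGER audit_log_enqueue
    AFTER INSERT ON audit_log
    FOR EACH ROW EXECUTE FUNCTION enqueue_audit_log();
`

func main() {
	// DATABASE_URL is a placeholder connection string, e.g. postgres://user:pass@host/db
	db, err := sql.Open("postgres", os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	if _, err := db.Exec(setupSQL); err != nil {
		log.Fatal(err)
	}
	log.Println("audit log queue table and trigger are in place")
}
```

Because the trigger fires for each inserted row, the queue stays in sync with the source table without any changes to the application code that writes the audit logs.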
We plan to write a new service that periodically reads the new table, gets the last few items, and sends them to Algolia. It would be even better if multiple such mini-services could run simultaneously without interrupting each other and work on the queue independently.
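A consumer for that queue could look roughly like the sketch below, assuming the v3 Algolia Go API client and the placeholder queue table from the previous sketch. Each pass claims the newest rows with FOR UPDATE SKIP LOCKED, so several instances can work on the queue at the same time without stepping on each other, and rows are only removed once Algolia has accepted them. The index name, credentials, batch size, and polling interval are all illustrative.

```go
package main

import (
	"database/sql"
	"log"
	"os"
	"time"

	"github.com/algolia/algoliasearch-client-go/v3/algolia/search"
	_ "github.com/lib/pq" // PostgreSQL driver
)

// queuedRecord is the anonymized payload we push to Algolia for each queued row.
type queuedRecord struct {
	ObjectID string `json:"objectID"`
	ActorID  string `json:"actorID"`
	ReportID string `json:"reportID"`
	Action   string `json:"action"`
}

func main() {
	db, err := sql.Open("postgres", os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Placeholder credentials and index name.
	client := search.NewClient(os.Getenv("ALGOLIA_APP_ID"), os.Getenv("ALGOLIA_API_KEY"))
	index := client.InitIndex("audit-logs")

	for {
		if err := drainOnce(db, index); err != nil {
			log.Println("sync failed:", err)
		}
		time.Sleep(30 * time.Second)
	}
}

// drainOnce claims up to 100 of the newest queued rows, sends them to Algolia,
// and removes them from the queue. SKIP LOCKED lets concurrent consumers each
// grab a different set of rows instead of blocking on one another.
func drainOnce(db *sql.DB, index *search.Index) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	defer tx.Rollback()

	rows, err := tx.Query(`
		DELETE FROM audit_log_queue
		WHERE id IN (
			SELECT id FROM audit_log_queue
			ORDER BY id DESC
			LIMIT 100
			FOR UPDATE SKIP LOCKED
		)
		RETURNING 'audit-' || audit_id, actor_id, report_id, action`)
	if err != nil {
		return err
	}
	defer rows.Close()

	var batch []queuedRecord
	for rows.Next() {
		var r queuedRecord
		if err := rows.Scan(&r.ObjectID, &r.ActorID, &r.ReportID, &r.Action); err != nil {
			return err
		}
		batch = append(batch, r)
	}
	if err := rows.Err(); err != nil {
		return err
	}
	if len(batch) == 0 {
		return tx.Commit()
	}

	// Only commit (and thus drop the rows from the queue) after Algolia accepted the batch.
	if _, err := index.SaveObjects(batch); err != nil {
		return err
	}
	return tx.Commit()
}
```

Deleting inside the same transaction that sends the batch gives us at-least-once delivery: if the Algolia call fails, the transaction rolls back and the rows stay queued for the next pass, and because the objectID is stable, re-sending a row simply overwrites the same record.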
In the next part of this article series, we’ll put the plan in motion by setting up the required triggers, seeding some random test data, and finally setting up the Producer and Consumer code to handle syncing data from PostgreSQL to our Algolia Index.