
Latest News

Can you achieve self-service analytics amid low data literacy?

Customers wanting to drive self-service analytics as part of creating a data-driven organization often ask, “Can we achieve self-service analytics when our workforce has low data literacy?” Or they might say they are not ready for self-service analytics, incorrectly thinking they first need to improve data literacy. But the two are inextricably linked. I liken it to teaching a child to read without giving them any books on which to build their skills.

An Ultimate Guide to Node.js Logging

Logging helps developers reduce errors and detect cyber-attacks. Applications are dynamic by nature: we can't always predict how an application will react to data changes, errors, or program changes. Logging allows us to better understand our own programs. For every application, a solid logging solution is essential. A good logging system increases the application's stability and makes it easier to maintain on the production server.
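To make the idea concrete, here is a minimal sketch of a structured JSON logger in plain Node.js — an illustration only, not the article's recommendation; real projects typically reach for an established library such as winston or pino. The level names and `createLogger` helper below are our own assumptions:

```javascript
// Minimal structured JSON logger — a sketch, not a production solution.
const LEVELS = { debug: 10, info: 20, warn: 30, error: 40 };

function createLogger(minLevel = "info", out = process.stdout) {
  // Build one method per level: log.debug(...), log.info(...), etc.
  return Object.fromEntries(
    Object.keys(LEVELS).map((level) => [
      level,
      (message, meta = {}) => {
        if (LEVELS[level] < LEVELS[minLevel]) return; // drop below threshold
        out.write(
          JSON.stringify({
            time: new Date().toISOString(),
            level,
            message,
            ...meta, // extra context, e.g. userId or requestId
          }) + "\n"
        );
      },
    ])
  );
}

const log = createLogger("info");
log.debug("suppressed at info level");
log.info("user signed in", { userId: 42 });
log.error("payment failed", { orderId: "A-17" });
```

One JSON object per line keeps the output machine-parsable, so production log shippers and search tools can filter by level or by any metadata field.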

How to Make a Build vs. Buy Decision for a Software Solution

Buying software is often the answer for busy engineering teams in search of a quick solution with minimum aftercare. But while your team may be sure of the problem, how do you go about searching for a product to fix it? Far from being the 'easy option', a bought solution demands plenty of consideration before you invest – user experience, cost comparisons, and support features, to name a few. Let's explore what goes into making a good decision.

Assessing security risks with Kafka audits

Suppose that you work for the infosec department of a government agency in charge of tax collection. You recently noticed that some tax fraud incident records went missing from a certain Apache Kafka topic. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data. But for Kafka in particular, this can prove challenging.

Increase compliance with Kafka audits

Suppose that you work for a government tax agency. You recently noticed that some tax fraud incident records have been leaked on the darknet. This information is held in a Kafka topic, and the incident response team wants to know who has accessed the data over the last six months. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data, precisely so that this kind of situation can be investigated.

Interview with conversational AI specialist James Kaplan

For our latest specialist interview in our series speaking to technology leaders from around the world, we’ve welcomed James Kaplan, CEO and Co-Founder of MeetKai. He founded the startup with his Co-Founder and Chairwoman, Weili Dai, after becoming frustrated with the limitations of existing automated assistants. Kaplan has had a true passion for AI and coding since he was six: he wrote his first bot at only nine years old and created the first Pokemon Go bot.

What Is Needed for an SFTP Connection?

Along with its security benefits, an SFTP connection is one of the quickest and most efficient ways to transfer files between two local or remote systems. When moving files or data from one server to another, SFTP is among the best options for ensuring the data is not tampered with in transit. An SFTP connection is especially useful for commonly used data integration patterns such as ETL and Reverse ETL. So what makes SFTP so great, and what exactly is needed for an SFTP connection?
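At minimum, an SFTP connection needs a host, the SSH port (22 by default), and credentials — a password or, preferably, an SSH private key. The sketch below shows those pieces as a Node.js config object; the hostname, username, and the ssh2-sftp-client library mentioned in the comments are our own illustrative assumptions, not prescribed by the article:

```javascript
// The ingredients of an SFTP connection. Hostname and username are
// placeholders, not real endpoints.
const sftpConfig = {
  host: "sftp.example.com",           // remote server
  port: 22,                           // SFTP runs over SSH; 22 is the default
  username: "etl-user",
  // password: "s3cret",              // password auth, or:
  privateKey: process.env.SFTP_KEY,   // key-based auth (recommended)
};

// With a client library (e.g. the ssh2-sftp-client npm package — an
// assumption, not the article's recommendation), a transfer is short:
//
//   const Client = require("ssh2-sftp-client");
//   const sftp = new Client();
//   await sftp.connect(sftpConfig);
//   await sftp.put("local/orders.csv", "/inbound/orders.csv");
//   await sftp.end();
```

Key-based authentication avoids shipping passwords around, which is one reason SFTP is a comfortable fit for automated ETL jobs.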

Pillars of Knowledge, Best Practices for Data Governance

With hackers now working overtime to expose business data or implant ransomware, data security is arguably IT managers’ top priority. And if data security tops IT concerns, data governance should come a close second. Not only is it critical to protect data, but data governance is also the foundation for data-driven businesses and for maximizing value from data analytics. Requirements, however, have changed significantly in recent years.

Accelerating Insight and Uptime: Predictive Maintenance

Historically, maintenance has been driven by a preventative schedule. Today, preventative maintenance, where actions are performed on a fixed schedule regardless of actual condition, is giving way to predictive, or condition-based, maintenance, where actions are based on real-time insight into actual operating conditions. While both are far superior to traditional corrective maintenance (acting only after a piece of equipment fails), predictive maintenance is by far the most effective.

Data Lakehouses: Have You Built Yours?

In traditional data warehouses, specific types of data are stored in a predefined database structure. Because of this “schema on write” approach, a significant transformation effort is required before all data sources can be consolidated into one warehouse. This is where data lakes come in!