
Latest News

Data Management and the Four Principles of Data Mesh

A relatively new term in the world of data management, data mesh refers to a decentralized approach in which data is treated as a product, owned and served by the domain teams closest to it rather than by a single central team. Done well, it gives business users easy access to the data they require for decision-making. Four principles guide data mesh design and implementation: domain ownership, data as a product, a self-serve data platform, and federated computational governance. This article will discuss these principles and how they can help your business get the most out of its data.

Tideways joins the Open Source Pledge

Tideways is joining the Open Source Pledge because we want to make a public commitment to our open source contributions. Not only does our product Tideways rely on open source software, our business is also built on top of the open source language PHP and its continued success. The mission of the recently launched Open Source Pledge initiative is to establish a new social norm in the tech industry: companies paying open source maintainers.

Making Waves with AI: Ensure Smooth Sailing by Automating Shipping Document Processing

The year is 1424. You’re shipping goods across the world, and the ship’s master hands you a bill of lading: a piece of paper containing details about what your goods are, where you’re shipping them from, and where they’re headed. Fast forward to 2024. You’re shipping your goods across the world, and the shipping company gives you a bill of lading. It’s still (most likely) a piece of paper.

5 Strategies to Reduce ETL Project Implementation Time for Businesses

Picture this: You are part of a BI team at a global garment manufacturer with dozens of factories, warehouses, and stores worldwide. Your team is tasked with extracting insights from company data. You begin the ETL (Extract, Transform, Load) process but find yourself struggling with the manual effort of understanding table structures and revisiting and modifying pipelines due to ongoing changes in data sources or business requirements.

A practical guide to web scraping with Ruby

One of the benefits of Ruby's developer-friendly syntax is that it's easy to quickly build scripts that automate tasks. Web scraping with Ruby is fun, useful, and straightforward. In this article, we'll use HTTParty to pull a web page and check it for a given string. To be specific, we'll build a cron job in Ruby that checks whether a product is in stock on a website!
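A minimal sketch of what such a script can look like. The article builds on the HTTParty gem; to keep this sketch dependency-free, the fetch below uses Ruby's stdlib Net::HTTP instead (with HTTParty, `HTTParty.get(url).body` would do the same job). The URL and the "in stock" phrase are placeholders, not values from the article.

```ruby
require "net/http"
require "uri"

# Placeholders — substitute the real product page and the phrase
# that appears on it when the product is available.
PRODUCT_URL = "https://example.com/product"
IN_STOCK_PHRASE = "Add to cart"

# Pure string check, kept separate from the fetch so it's easy to test.
def in_stock?(html, phrase)
  html.include?(phrase)
end

if __FILE__ == $PROGRAM_NAME
  body = Net::HTTP.get(URI(PRODUCT_URL))
  puts in_stock?(body, IN_STOCK_PHRASE) ? "In stock!" : "Not yet."
end
```

To turn it into a cron job, schedule the script, for example every 30 minutes: `*/30 * * * * ruby check_stock.rb`.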

Orchestrating Konnect's Great API Renaming with Kong Gateway Transformer Plugins

Before we dive in, we want to remind our readers that the Konnect engineering team actively uses Kong products like Kong Gateway, Kong Mesh, and Insomnia. In this post, we'll showcase the power of Kong Gateway and two plugins — the JQ and Request Transformer Advanced Plugins — to govern and configure APIs, and explain how they played, and continue to play, a crucial role in the release of a new v2 Konnect API specification. The key takeaway here is: don’t do more work than is necessary.
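To illustrate the renaming pattern in general terms (this is not Konnect's actual configuration), a declarative Kong Gateway snippet can attach the Request Transformer Advanced plugin to a route matching a legacy path and rewrite the URI to the renamed resource. The service name, route name, upstream URL, and paths below are all hypothetical:

```yaml
_format_version: "3.0"
services:
- name: example-api                  # hypothetical service
  url: http://upstream.internal:8000
  routes:
  - name: legacy-v1-route            # matches the old v1 path
    paths:
    - /v1/old-resource
    plugins:
    - name: request-transformer-advanced
      config:
        replace:
          uri: /v2/new-resource      # forward to the renamed resource
```

The appeal of this approach is the takeaway the post names: the gateway rewrites requests in flight, so clients on the old path keep working without upstream services doing extra work.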

Using AI in Insurance Underwriting for Accelerated Time-to-Value

For insurance companies, balancing customer expectations with the rigorous requirements necessary to mitigate risk is a challenge, especially when underwriting is handled manually. By turning to artificial intelligence (AI) in insurance underwriting, you avoid costly delays, streamline your employees’ processes, improve accuracy, and create an optimal customer experience. In this blog post, you’ll learn how AI facilitates greater efficiency in underwriting.

More Fortune 500 Companies Are Adopting Snowflake Data Clean Rooms, Powering the Privacy-First Era

Privacy is no longer a growing requirement for doing business — it's the new status quo. The stakes for not protecting it have only intensified. Consumers have been demanding greater control and privacy over their data for years, and now vast numbers are taking action to protect it, turning off tracking, using cookieless environments and relying on ad blockers at rapidly increasing rates.

Heuristics in Software Testing: Hunt Bugs With Style

Heuristics are the key to turning you into the Turkish Olympic guy. They come with experience: at a certain point, experts in every field develop that “sense” for their work, because repeated exposure to hundreds of problems helps them recognize patterns that beginners don’t easily see. Heuristics aren't always foolproof, but they're good for immediate problem-solving.

From Data Pipeline Automation to Adaptive Data Pipelines

Data pipeline automation plays a central role in integrating and delivering data across systems. The architecture is excellent at handling repetitive, structured tasks, such as extracting, transforming, and loading data in a steady, predictable environment, because the pipelines are built around fixed rules and predefined processes. As a result, they will keep working only as long as you maintain the status quo, i.e., as long as your data follows a consistent structure.
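To make that brittleness concrete, here is a toy fixed-rule transform step (the record shape and field names are invented for illustration): it works as long as every record carries the expected keys, and fails the moment a source system renames one.

```ruby
# A fixed-rule transform step: it hard-codes the expected record shape.
def transform(records)
  records.map do |r|
    {
      id:           r.fetch("order_id"),           # fetch raises KeyError on a missing key
      amount_cents: (r.fetch("amount") * 100).round
    }
  end
end

steady = [{ "order_id" => 7, "amount" => 19.99 }]
transform(steady)          # fine while the schema holds

drifted = [{ "order_id" => 7, "total" => 19.99 }]  # source renamed "amount"
# transform(drifted)       # raises KeyError — the fixed pipeline breaks
```

An adaptive pipeline, by contrast, would detect the schema drift and remap or flag the field instead of failing outright.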