Latest News

Data Readiness and Quality: The Big New Challenges for All Companies

We live in a digital age increasingly driven by algorithms and data. All of us, whether at home or at work, relate to one another more and more via data. It is a systemic restructuring of society, our economy, and our institutions the like of which we haven't seen since the industrial revolution. In the business world we commonly refer to it as digital transformation. In this algorithmic world, data governance is becoming a major challenge.

Meeting SLAs for Data Pipelines on Amazon EMR With Unravel

A household name in global media analytics – let's call them MTI – is using Unravel to support their data operations (DataOps) on Amazon EMR, establish and protect their internal service level agreements (SLAs), and get the most out of their Spark applications and pipelines. Amazon EMR was an easy choice for MTI as the platform to run all their analytics. To start with, getting up and running is simple: there is nothing to install and no configuration is required.
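
To give a feel for how little setup is involved, here is a minimal sketch, not taken from MTI's environment, of launching an EMR cluster with Spark pre-installed using boto3. The cluster name, instance types, log bucket, and region are illustrative placeholders; the IAM roles shown are the AWS defaults.

    # Minimal sketch: launch an EMR cluster with Spark via boto3.
    # Names, sizes, region, and log bucket are placeholders.
    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.run_job_flow(
        Name="analytics-spark-cluster",        # hypothetical cluster name
        ReleaseLabel="emr-6.10.0",             # any recent EMR release with Spark
        Applications=[{"Name": "Spark"}],
        Instances={
            "InstanceGroups": [
                {"Name": "Primary", "InstanceRole": "MASTER",
                 "InstanceType": "m5.xlarge", "InstanceCount": 1},
                {"Name": "Core", "InstanceRole": "CORE",
                 "InstanceType": "m5.xlarge", "InstanceCount": 2},
            ],
            "KeepJobFlowAliveWhenNoSteps": True,
            "TerminationProtected": False,
        },
        JobFlowRole="EMR_EC2_DefaultRole",     # default EMR EC2 instance profile
        ServiceRole="EMR_DefaultRole",         # default EMR service role
        LogUri="s3://my-emr-logs/",            # placeholder log bucket
    )

    print("Started cluster:", response["JobFlowId"])

From there, Spark steps can be submitted to the running cluster and monitored against the SLAs the team has defined.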

Generating a Heat Map with Twitter data using Pipeline Designer - Part 1

For me, the most exciting thing about Pipeline Designer is how it makes working with streaming data easy. Traditionally this has required a completely different way of thinking if you have come from a "batch" world. So when Pipeline Designer was released, the first thing I wanted to do was find a good streaming data source and do something fun and interesting with the data. Twitter was my first choice of streaming data source.
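
For readers who want to see what the raw feed looks like before it reaches Pipeline Designer, here is a minimal sketch of consuming the Twitter streaming API with the tweepy library, keeping only tweets that carry geo data so they could later feed a heat map. This is an assumption-laden illustration, not the article's Pipeline Designer flow: the bearer token and filter rule are placeholders.

    # Minimal sketch: stream geo-tagged tweets with tweepy (Twitter API v2).
    # Bearer token and filter rule are placeholders, not from the article.
    import tweepy

    class GeoTweetListener(tweepy.StreamingClient):
        def on_tweet(self, tweet):
            # tweet.geo is only populated when "geo" is requested in tweet_fields
            if tweet.geo:
                print(tweet.id, tweet.geo)

    listener = GeoTweetListener("YOUR_BEARER_TOKEN")          # placeholder credential
    listener.add_rules(tweepy.StreamRule("coffee has:geo"))   # placeholder stream rule
    listener.filter(tweet_fields=["geo"])                     # blocks and streams matching tweets

Each geo-tagged tweet that arrives this way could then be mapped to coordinates and aggregated into the heat map described in this series.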

Making Data Work With the Unravel Partner Program

Modern data apps are increasingly moving to the cloud, driven by elastic compute demands, skills shortages, and the complexity of managing big data on premises. Yet as more and more organizations take their data apps to the cloud to leverage its flexibility, they are finding it challenging to assess application needs, plan the migration, and optimize their data so that performance and cost targets are not compromised.