
Cloud

Moving Big Data and Streaming Data Workloads to AWS

Cloud migration may be the biggest challenge, and the biggest opportunity, facing IT departments today, especially if you use big data and streaming data technologies such as Cloudera, Hadoop, Spark, and Kafka. In this 55-minute webinar, Unravel Data product marketer Floyd Smith and Solutions Engineering Director Chris Santiago describe how to move workloads to AWS EMR, Databricks, and other destinations on AWS quickly and at the lowest possible cost.

The Modern Data Ecosystem - How Teams Collaborate to Unleash Their Data

With data becoming the main asset of a business, one of the biggest challenges is how to successfully leverage that data to gain a business advantage. In the modern data ecosystem, people with different skill sets need to collaborate and work together to achieve their data objectives. How does a modern analytics team of data scientists, business analysts, and data engineers work together? How do technologies such as machine learning, big data, and the cloud come together in a productive way?

How to Migrate Your Enterprise Data Warehouse to a Cloud Data Warehouse

Migrating a data warehouse from a legacy environment requires a massive upfront investment in resources and time, and there is a lot to consider before and during the migration. You may need to replan your data model, use a separate platform for task scheduling, and handle changes in the application’s database driver. Organizations must therefore take a strategic approach to streamline the process. This article presents a step-by-step approach for migrating a data warehouse to the cloud.
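To make the driver point concrete, here is a minimal, hypothetical sketch of what that change can look like when an application talks to the warehouse through SQLAlchemy; the hostnames, credentials, and choice of warehouses are assumptions for illustration only.

```python
from sqlalchemy import create_engine, text

# Before: legacy on-premises warehouse (hypothetical host and credentials).
# engine = create_engine("postgresql://etl_user:secret@onprem-dw.internal:5432/analytics")

# After: often the main application-side change is the driver and connection URL
# (this example assumes the snowflake-sqlalchemy dialect is installed).
engine = create_engine("snowflake://etl_user:secret@my_account/analytics/public")

with engine.connect() as conn:
    # Application-level SQL can frequently stay the same across the move.
    print(conn.execute(text("SELECT COUNT(*) FROM sales")).scalar())
```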

Cloud-Based Data Analytics in Three Steps

Implementing a modern, cloud-based analytics stack doesn’t have to be hard; in fact, you can do it in three steps. A modern data stack (MDS), consisting of a data integration tool, a cloud data warehouse, and a business intelligence platform, is the best way to establish a successful analytics program as data sources and data volumes multiply.

Journey to the Cloud: From On-Prem to Public Cloud With Kong | Tyler Technologies

The team at Tyler Technologies has used Kong to help take the company’s applications from on-premises installations to multi-tenant cloud services. In this session, we’ll explore how Kong can make this move without clients ever knowing it happened, how we use Kong Brain to automatically generate OpenAPI documentation for our integrators, and the AWS infrastructure choices we made to get a large, robust Kong instance running in AWS.
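As a rough illustration of how a gateway makes such a cutover invisible to clients, here is a minimal sketch against Kong’s Admin API (default port 8001). The service name and upstream hosts are hypothetical, and Tyler Technologies’ actual setup is certainly more involved.

```python
# Hypothetical sketch: repoint a Kong service from on-prem to cloud
# without changing the client-facing route. Names and hosts are made up.
import requests

ADMIN = "http://localhost:8001"  # Kong Admin API (default port)

# 1. Register the service, initially backed by the on-prem host.
requests.post(f"{ADMIN}/services",
              json={"name": "billing",
                    "url": "https://billing.onprem.example.com"})

# 2. Expose it to clients at a stable path.
requests.post(f"{ADMIN}/services/billing/routes",
              json={"name": "billing-route", "paths": ["/billing"]})

# 3. Later, cut over to the cloud deployment; clients still call /billing.
requests.patch(f"{ADMIN}/services/billing",
               json={"url": "https://billing.cloud.example.com"})
```

Because clients only ever see the route, the upstream swap in step 3 happens without any change on their side.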

Cost Optimization on Microsoft Azure

Do you use big data and streaming services such as Azure HDInsight, Databricks, and Kafka/Event Hubs? Do you have on-premises big data that you want to move to Azure? Keeping costs down in Microsoft Azure is difficult, but vital. Join Chris Santiago of Unravel Data and explore how to reduce, manage, and allocate streaming data and big data costs in Azure.

Cost-Effective, High-Performance Move to Cloud

The move to the cloud may be the biggest challenge, and opportunity, facing IT departments today. In this 45-minute webinar, Unravel Data product marketer Floyd Smith and Solutions Engineering Director Chris Santiago describe how to move workloads to the cloud quickly, cost-effectively, and with high performance for the newly migrated workloads. Tune in to find out the best way to de-risk your cloud migration projects with data-driven insights.

New Apache Kafka to AWS S3 Connector

Many in the community have been asking us to develop a new Kafka to S3 connector for some time, so we’re pleased to announce that it’s now available. It’s been designed to deliver a number of benefits over existing S3 connectors. Like our other Stream Reactors, the connector extends the standard Kafka Connect configuration with a parameter that takes a SQL-like command (Lenses Kafka Connect Query Language, or “KCQL”), which defines how to map data from the source (in this case, Kafka) to the target (S3).
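To make the KCQL idea concrete, here is a rough sketch of registering such a sink through the standard Kafka Connect REST API. The connector class, property names, and KCQL clauses shown are assumptions based on Stream Reactor conventions, so check the connector’s documentation for the exact spelling.

```python
# Hypothetical sketch: register an S3 sink via the Kafka Connect REST API.
# The connector class and property names are assumed, not taken verbatim.
import requests

config = {
    "name": "orders-to-s3",
    "config": {
        # Assumed Stream Reactor S3 sink class name:
        "connector.class": "io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        # The KCQL parameter maps the Kafka topic to a bucket and prefix:
        "connect.s3.kcql": (
            "INSERT INTO my-bucket:orders "
            "SELECT * FROM orders "
            "STOREAS 'JSON' "
            "WITH_FLUSH_COUNT = 5000"
        ),
    },
}

# Standard Kafka Connect REST endpoint (default port 8083).
requests.post("http://localhost:8083/connectors", json=config).raise_for_status()
```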

Support for Calling External Functions via Azure API Management Now in Public Preview

In June, Snowflake announced the public preview of the external functions feature with support for calling external APIs via AWS API Gateway. With external functions, you can easily extend your data pipelines by calling out to external services, third-party libraries, or even your own custom logic, enabling exciting new use cases. For example, you can use external functions for external tokenization, geocoding, scoring data using pre-trained machine learning models, and much more.
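As a rough sketch of what the Azure flavor might look like, the following uses the Snowflake Python connector to create an API integration and an external function, then calls it from SQL. The tenant and application IDs, endpoint URL, and the geocoding function are placeholders, and the integration parameter names are assumptions to be checked against Snowflake’s documentation.

```python
# Hypothetical sketch: wire Snowflake to an Azure API Management endpoint.
# IDs, URLs, and names are placeholders; parameter names are assumed.
import snowflake.connector

conn = snowflake.connector.connect(user="ME", password="...", account="my_account")
cur = conn.cursor()

# API integration pointing at an Azure API Management instance (assumed syntax).
cur.execute("""
    CREATE OR REPLACE API INTEGRATION azure_apim_int
      API_PROVIDER = azure_api_management
      AZURE_TENANT_ID = '<tenant-id>'
      AZURE_AD_APPLICATION_ID = '<app-id>'
      API_ALLOWED_PREFIXES = ('https://my-apim.azure-api.net/')
      ENABLED = TRUE
""")

# External function backed by a (hypothetical) geocoding endpoint.
cur.execute("""
    CREATE OR REPLACE EXTERNAL FUNCTION geocode(address VARCHAR)
      RETURNS VARIANT
      API_INTEGRATION = azure_apim_int
      AS 'https://my-apim.azure-api.net/geocode'
""")

# Once created, the function is callable from ordinary SQL over table data.
cur.execute("SELECT address, geocode(address) FROM customers LIMIT 10")
print(cur.fetchall())
```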