Analytics

Streamlined Data Movement: Fivetran's SAP ERP Integration with Databricks Explained

Join Kelly Kohlleffel from Fivetran as he demonstrates the seamless integration of SAP ERP data into the Databricks Data Intelligence Platform using Fivetran's data movement automation capabilities. Learn how the Fivetran SAP ERP for HANA connector syncs your data to Databricks, making it ready for all data workloads. Discover how Fivetran uses Databricks features such as Serverless and Unity Catalog to help you quickly and efficiently build new data products and solutions with SAP ERP data.
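
As a feel for what becomes possible once the connector has landed the data, here is a minimal sketch of a PySpark query in a Databricks notebook. The "main.sap_erp" catalog and schema names are assumptions about connector configuration, while KNA1 and VBAK are standard SAP table names.

```python
# Minimal sketch: querying Fivetran-synced SAP ERP tables in Databricks with
# PySpark. The catalog/schema names ("main.sap_erp") are assumed; the actual
# destination schema depends on how the connector is configured.
from pyspark.sql import functions as F

# `spark` is the session Databricks provides in a notebook.
customers = spark.table("main.sap_erp.kna1")   # KNA1: SAP customer master
orders = spark.table("main.sap_erp.vbak")      # VBAK: SAP sales order headers

top_customers = (
    orders.groupBy("kunnr")                          # KUNNR: customer number
    .agg(F.count("vbeln").alias("order_count"))      # VBELN: sales document number
    .join(customers.select("kunnr", "name1"), "kunnr")  # NAME1: customer name
    .orderBy(F.desc("order_count"))
    .limit(10)
)
top_customers.show()
```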

Fast-Track to Data Insights: Deliver Impactful Salesforce Sales Metrics to the Databricks Gold Layer

Join Kelly Kohlleffel from Fivetran for this demonstration of moving and transforming raw Salesforce data into impactful sales metrics in the Databricks Data Intelligence Platform. Learn how to set up a Salesforce-to-Databricks connector with Fivetran's fully automated, fully managed data movement platform. Then watch how the new dataset in Databricks is automatically transformed from the bronze layer to the gold layer, making it analytics-ready and data-product-ready in minutes.
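
As a rough sketch of what a bronze-to-gold step can look like in PySpark: the table names and layering convention below are assumptions for illustration, not Fivetran's or Databricks' fixed naming.

```python
# Minimal sketch of a bronze -> gold transformation in Databricks (PySpark).
# "salesforce_bronze.opportunity" and "salesforce_gold.pipeline_by_stage" are
# assumed names; Fivetran lands raw Salesforce objects, and the medallion
# layering convention is up to your workspace.
from pyspark.sql import functions as F

bronze = spark.table("main.salesforce_bronze.opportunity")

gold = (
    bronze.filter(F.col("is_deleted") == False)  # drop soft-deleted records
    .groupBy("stage_name")
    .agg(
        F.count("id").alias("opportunity_count"),
        F.sum("amount").alias("total_pipeline_amount"),
    )
)

# Write the analytics-ready aggregate to the gold layer as a managed table.
gold.write.mode("overwrite").saveAsTable("main.salesforce_gold.pipeline_by_stage")
```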

Event-Driven Microservices in Banking and Fraud Detection | Designing Event-Driven Microservices

How do we know whether event-driven microservices are the right solution? That is the question Tributary Bank faced when it looked at modernizing its aging fraud-detection system. The bank confronted many challenges, including scalability, reliability, and security. Some members of the team felt that switching to an event-driven microservice architecture would be the magic bullet that solved all of their problems. But is there any such thing as a magic bullet? Let's take a look at the kinds of decisions Tributary Bank had to make as it started down this path.
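
To make the moving parts concrete, here is a minimal sketch of one fraud-check consumer in such an architecture, assuming Kafka and the kafka-python client. The topic names and the threshold rule are illustrative placeholders, not a design from the course.

```python
# Minimal sketch of an event-driven fraud-check service: consume transaction
# events, apply a check, and emit alerts as new events. Assumes a local Kafka
# broker and the kafka-python package; the threshold rule stands in for
# whatever scoring a real fraud system would apply.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "transactions",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for event in consumer:
    txn = event.value
    # Placeholder rule: flag unusually large transactions for review.
    if txn.get("amount", 0) > 10_000:
        producer.send(
            "fraud-alerts",
            {"transaction_id": txn.get("id"), "reason": "amount_threshold"},
        )
```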

Top 3 Benefits of Automated Analytics

Imagine transforming raw business data into actionable insights with minimal effort. This is the value proposition of automated analytics, a form of data analytics that modern business intelligence (BI) and analytics software vendors are fast making more accessible. Independent software vendors (ISVs) and enterprise organizations on the cusp of investing in analytics struggle with manual data processes that are time-consuming and error-prone.

ANSI X12 vs EDIFACT: Key Differences

Electronic Data Interchange (EDI) is a popular communication method that enterprises use to exchange information accurately and quickly with trading partners. EDI transmits data almost instantaneously, serving as a fast and efficient mode for exchanging business documents. ANSI X12 and EDIFACT are the two most common EDI standards, but they differ in structure, style, and usage.
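
A small sketch makes the structural difference concrete. X12 typically delimits elements with "*" and segments with "~", while EDIFACT uses "+" for elements, ":" for components, and "'" for segments. The fragments below are simplified for illustration, not complete valid interchanges.

```python
# Simplified fragments of the two formats (not full interchanges).
x12_sample = "ST*850*0001~BEG*00*SA*PO1234**20240101~"       # X12 850 purchase order
edifact_sample = "UNH+1+ORDERS:D:96A:UN'BGM+220+PO1234+9'"   # EDIFACT ORDERS message

def split_x12(message):
    """Split an X12 fragment into segments, then elements."""
    return [seg.split("*") for seg in message.strip("~").split("~")]

def split_edifact(message):
    """Split an EDIFACT fragment into segments, then elements."""
    return [seg.split("+") for seg in message.strip("'").split("'")]

print(split_x12(x12_sample))
# [['ST', '850', '0001'], ['BEG', '00', 'SA', 'PO1234', '', '20240101']]
print(split_edifact(edifact_sample))
# [['UNH', '1', 'ORDERS:D:96A:UN'], ['BGM', '220', 'PO1234', '9']]
```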

What Is Database Schema? A Comprehensive Guide

A database schema, or DB schema, is an abstract design representing how your data is stored in a database. A schema can be represented visually with a schema diagram, which shows the database's tables, their columns, and the relationships between them. Database schemas are at the heart of every scalable, high-performance database: they are the blueprint that defines how a database stores and organizes data, how its components relate to one another, and how it responds to queries.
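
To make the blueprint idea concrete, here is a minimal two-table schema, using SQLite so the example is self-contained; the table and column names are illustrative.

```python
# A minimal concrete schema: two tables plus a foreign key. The schema defines
# the structure (columns, types) and the relationship between the tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE books (
        id        INTEGER PRIMARY KEY,
        title     TEXT NOT NULL,
        author_id INTEGER REFERENCES authors(id)  -- one author, many books
    );
""")

# The schema itself is stored in the database and can be inspected:
for name, sql in conn.execute("SELECT name, sql FROM sqlite_master WHERE type = 'table'"):
    print(name, sql)
```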

Discover Financial Services Automates Data Ingestion for Real-Time Decision-Making at Scale

Making operational decisions in a tight timeframe is critical to the success of an organization. Real-time data ingestion makes data available sooner, which in turn enables timely decision-making. Real-time ingestion is foundational to our digital transformation at Discover Financial Services. As a senior manager leading the streaming and real-time data platforms at Discover, I don't want to be in the business of manually replicating data.
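
The article does not detail Discover's actual stack, but as one common shape of real-time ingestion, here is a minimal Spark Structured Streaming sketch that reads a Kafka topic and lands it continuously as a Delta table; the topic, broker, and table names are placeholders.

```python
# Minimal sketch of continuous ingestion with Spark Structured Streaming.
# Assumes a Spark session (`spark`) with the Spark-Kafka integration package
# available; topic, servers, and paths are placeholders.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "card-transactions")
    .load()
)

# Kafka delivers the payload as binary; cast it for downstream use.
events = raw.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

# Land the stream as a Delta table; the checkpoint makes ingestion resumable.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/card_transactions")
    .toTable("ingest.card_transactions")
)
```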

Snowflake Expands Partnership with Microsoft to Improve Interoperability Through Apache Iceberg

Today we’re excited to announce an expansion of our partnership with Microsoft to deliver a seamless and efficient interoperability experience between Snowflake and Microsoft Fabric OneLake, in preview later this year. This will enable our joint customers to access data bidirectionally between Snowflake and Microsoft Fabric, working from a single copy of the data stored in OneLake in Fabric.
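
The open-format piece of this story can be sketched on the Snowflake side: creating a Snowflake-managed Iceberg table whose underlying files other engines can read. The connection details and volume names below are placeholders, and the OneLake wiring from the announcement is not shown here.

```python
# Minimal sketch: a Snowflake-managed Apache Iceberg table, created via the
# snowflake-connector-python package. Credentials and the external volume are
# placeholders you would replace with your own.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholder credentials
    warehouse="my_wh", database="my_db", schema="public",
)
conn.cursor().execute("""
    CREATE ICEBERG TABLE IF NOT EXISTS orders (
        order_id INT,
        amount   NUMBER(10, 2)
    )
    EXTERNAL_VOLUME = 'my_external_volume'   -- customer-managed storage location
    CATALOG = 'SNOWFLAKE'                    -- Snowflake acts as the Iceberg catalog
    BASE_LOCATION = 'orders/'
""")
```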

LLM Validation and Evaluation

LLM evaluation is the process of assessing the performance and capabilities of large language models (LLMs). It helps determine how well a model understands and generates language, ensuring that it meets the specific needs of an application. There are multiple ways to perform LLM evaluation, each with its own advantages. In this blog post, we explain the role of LLM evaluation in the AI lifecycle and the different types of LLM evaluation methods. Finally, we show a demo of a chatbot that was developed with crowdsourcing.
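
As a minimal sketch of one evaluation method, reference-based scoring over a small test set, here is a toy harness; `ask_model` is a stand-in for whatever LLM call your application makes, and real evaluations would add richer metrics such as semantic similarity or LLM-as-a-judge scoring.

```python
# Toy reference-based LLM evaluation: score model answers against expected
# references and report accuracy. The test set and model are placeholders.
test_set = [
    {"prompt": "What is the capital of France?", "reference": "Paris"},
    {"prompt": "How many days are in a leap year?", "reference": "366"},
]

def ask_model(prompt: str) -> str:
    # Placeholder: swap in a real call to your model or chatbot.
    return "Paris is the capital of France."

def exact_match(answer: str, reference: str) -> bool:
    # Crude check: does the answer contain the reference string?
    return reference.lower() in answer.lower()

def evaluate(dataset) -> float:
    results = [exact_match(ask_model(item["prompt"]), item["reference"]) for item in dataset]
    return sum(results) / len(results)

print(evaluate(test_set))  # 0.5 with the placeholder model above
```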