

How to start a data literacy program in 6 steps

In a world where 2.5 quintillion bytes of data are created every day, it’s not surprising that organizations want to harness the power of being data-driven. In our 2022 Data Health Barometer, 99% of companies surveyed recognized that data is crucial for success — but 97% said they face challenges in using data effectively. Perhaps in response to those challenges, 65% of companies reported that they'd started a data literacy program.

Why You Should Move From Management Reporter to Jet Reports

Much as Apple users tend to be all Apple, all the time, Microsoft Dynamics ERP users tend to prefer Microsoft products for all their computing needs. It’s not hard to understand why: using products from the same ecosystem prevents compatibility issues and saves the time of learning multiple systems.

Building a Data-Centric Platform for Generative AI and LLMs at Snowflake

Generative AI and large language models (LLMs) are revolutionizing many aspects of both developer and non-coder productivity by automating repetitive tasks and quickly generating insights from large amounts of data. Snowflake users are already taking advantage of LLMs to build compelling apps, integrating with web-hosted LLM APIs via external functions and using Streamlit as an interactive front end for LLM-powered apps such as AI plagiarism detection, an AI assistant, and MathGPT.
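As a rough illustration of the integration pattern described above, the sketch below shapes a request for a web-hosted chat-completion API and extracts the reply. The field names mirror common hosted LLM APIs but are assumptions; the post does not name a specific provider, and a real app would send the payload over HTTPS from an external function or a Streamlit front end.

```python
def build_chat_payload(prompt, model="example-model"):
    """Assemble a request body in the shape many hosted chat APIs expect.

    Hypothetical sketch: "model" and "messages" are common field names,
    not tied to any particular provider mentioned in the post.
    """
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def extract_reply(response_body):
    """Pull the assistant's text out of a typical chat-completion response."""
    return response_body["choices"][0]["message"]["content"]


# Usage: build a request, then parse a (simulated) response
payload = build_chat_payload("Summarize last quarter's sales data.")
reply = extract_reply({"choices": [{"message": {"content": "Here is a summary..."}}]})
```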

Using Dead Letter Queues with SQL Stream Builder

Cloudera SQL Stream Builder gives non-technical users the power of a unified stream processing engine, so they can integrate, aggregate, query, and analyze both streaming and batch data sources in a single SQL interface. This allows business users to define events of interest that they need to continuously monitor and respond to quickly. A dead letter queue (DLQ) can be used to capture deserialization errors when events are consumed from a Kafka topic.
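The DLQ pattern itself is simple: try to deserialize each record, route successes to the main pipeline, and divert unparseable payloads to a separate queue for inspection. The sketch below is a generic illustration of that pattern, not SQL Stream Builder's actual implementation; `main_sink` and `dlq_sink` are stand-ins for the target Kafka topics.

```python
import json


def route_event(raw_bytes, main_sink, dlq_sink):
    """Deserialize an incoming record, sending failures to the DLQ.

    Generic sketch of the dead letter queue pattern: main_sink and
    dlq_sink stand in for the main and DLQ Kafka topics.
    """
    try:
        event = json.loads(raw_bytes)
    except (json.JSONDecodeError, UnicodeDecodeError):
        # Deserialization failed: keep the raw payload for later inspection
        dlq_sink.append(raw_bytes)
    else:
        main_sink.append(event)


# Usage: one well-formed record and one malformed record
main, dlq = [], []
for record in [b'{"id": 1, "temp": 21.5}', b"not-json"]:
    route_event(record, main, dlq)
# main holds the parsed event; dlq holds the unparseable bytes
```

Keeping the raw bytes (rather than a partially parsed result) in the DLQ means nothing is lost: the record can be replayed once the schema or producer bug is fixed.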

Creating a basic write back solution with Qlik Cloud

Using Qlik Cloud Analytics and Qlik Application Automation, you can create sophisticated solutions to many business problems. With Qlik's new properties in the action button object, you can now execute an Application Automation workflow while passing parameter/value pairs to the workflow. Check out this simple walk-through to see an example of writing data back to a Microsoft Azure SQL database.
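At the database end, a write-back boils down to applying the parameter/value pairs sent by the button as an SQL update. The sketch below is hypothetical: the walk-through targets an Azure SQL database from an Application Automation workflow, but `sqlite3` stands in here so the example is self-contained, and the `orders` table, `record_id`, and `status` parameters are invented for illustration.

```python
import sqlite3


def write_back(conn, record_id, status):
    """Apply a write-back update built from front-end parameter/value pairs.

    Hypothetical sketch: record_id and status play the role of the
    parameters passed from the Qlik button object; sqlite3 stands in
    for the Azure SQL target so the example runs anywhere.
    """
    conn.execute("UPDATE orders SET status = ? WHERE id = ?", (status, record_id))
    conn.commit()


# Usage with an in-memory stand-in table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'open')")
write_back(conn, 1, "approved")
```

Parameterized queries (the `?` placeholders) matter here: values arriving from a front-end workflow should never be interpolated directly into the SQL string.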

SaaS In 60 - New filter, scheduler interval, write back and more!

This week we’ve added a new customizable filter object; a new interval in the scheduler for alerts and reloads; support for Parquet data files; and the ability to execute an app automation workflow from a button object in Qlik Cloud Analytics, with variable value passing that enables a number of advanced automated workflow use cases, including write-back.