

SaaS in 60 - Cyclic Group Dimensions

This week a new dimension type for Master Item dimensions is available: cyclic groups. Master Items offer a centralized, reusable, governed library of measures, expressions, visualizations, and both single and drillable dimensions. With cyclic-group dimensions, you can now dynamically cycle through dimensions across every chart at once with the click of a button or a simple selection, offering new ways to analyze data, saving precious screen space, and enabling multiple use cases on a single chart.

Authentication and Authorization Using Middleware in Django

Django is a “batteries-included” framework. It has a wide array of built-in features that handle common web development tasks: URL routing, template engine, object-relational mapping (ORM), and database schema migrations. This makes it highly efficient for rapid development, allowing developers to focus on building their applications' unique aspects without reinventing basic functionalities.
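
To make the middleware idea in the title concrete, here is a minimal sketch of an authentication-and-authorization middleware. The path prefix and group name are placeholders, not from the article:

    from django.http import JsonResponse

    class RoleAuthorizationMiddleware:
        # Placeholder values: a path prefix to guard and the group a
        # user must belong to.
        PROTECTED_PREFIX = "/admin-api/"
        REQUIRED_GROUP = "staff"

        def __init__(self, get_response):
            self.get_response = get_response  # next middleware, or the view itself

        def __call__(self, request):
            if request.path.startswith(self.PROTECTED_PREFIX):
                # Authentication: request.user is populated by Django's
                # AuthenticationMiddleware, which must run earlier in the chain.
                if not request.user.is_authenticated:
                    return JsonResponse({"detail": "Authentication required."}, status=401)
                # Authorization: only members of the required group get through.
                if not request.user.groups.filter(name=self.REQUIRED_GROUP).exists():
                    return JsonResponse({"detail": "Insufficient permissions."}, status=403)
            return self.get_response(request)

Registering the class in the MIDDLEWARE setting, after django.contrib.auth.middleware.AuthenticationMiddleware, applies both checks to every request before any view runs.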

How No-Code API Tools Automatically Generate APIs

At some point, anyone who has enjoyed using a computer has wondered if they could build their own app. But software development becomes intimidating fast if you’re not a programmer. Fortunately, there is a wide range of no-code platforms available today. Even in the enterprise, there’s demand for no-code development. As APIs have become one of the most important components of the modern application ecosystem, several no-code API solutions are now available.
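
The generation step itself is less magical than it sounds: most of these tools introspect an existing schema and emit one endpoint per object. As a rough Python illustration of that pattern (not any particular product's implementation; the database path is a placeholder), this sketch uses Flask and SQLite introspection to generate a read-only endpoint per table:

    import sqlite3
    from flask import Flask, jsonify

    app = Flask(__name__)
    DB_PATH = "example.db"  # placeholder database file

    def list_tables():
        # Introspect the schema, as a no-code tool would on connect.
        with sqlite3.connect(DB_PATH) as conn:
            rows = conn.execute(
                "SELECT name FROM sqlite_master "
                "WHERE type = 'table' AND name NOT LIKE 'sqlite_%'"
            ).fetchall()
        return [r[0] for r in rows]

    def register_read_endpoint(table):
        # Generate a GET /api/<table> endpoint from the table name alone.
        def handler():
            with sqlite3.connect(DB_PATH) as conn:
                conn.row_factory = sqlite3.Row
                rows = conn.execute(f"SELECT * FROM {table} LIMIT 100").fetchall()
            return jsonify([dict(row) for row in rows])
        app.add_url_rule(f"/api/{table}", endpoint=f"list_{table}", view_func=handler)

    for table in list_tables():
        register_read_endpoint(table)

    if __name__ == "__main__":
        app.run()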

Better See and Control Your Snowflake Spend with the Cost Management Interface, Now Generally Available

Snowflake is dedicated to providing customers with intuitive solutions that streamline their operations and drive success. As part of our ongoing commitment to helping customers in this way, we’re introducing updates to the Cost Management Interface to make managing Snowflake spend easier at an organization level and accessible to more roles.
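
For teams that prefer to track spend programmatically alongside the interface, the underlying metering data is queryable. A minimal sketch, assuming the snowflake-connector-python package and placeholder credentials and role names:

    import snowflake.connector

    # Placeholder credentials; in practice use key-pair auth or SSO.
    conn = snowflake.connector.connect(
        account="myorg-myaccount",
        user="COST_ANALYST",
        password="***",
        role="USAGE_VIEWER",  # a role granted access to the ACCOUNT_USAGE share
    )

    # Credits consumed per warehouse over the last 30 days, from the same
    # metering data that feeds the cost views.
    query = """
        SELECT warehouse_name,
               SUM(credits_used) AS credits_30d
        FROM snowflake.account_usage.warehouse_metering_history
        WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
        GROUP BY warehouse_name
        ORDER BY credits_30d DESC
    """
    for warehouse_name, credits in conn.cursor().execute(query):
        print(f"{warehouse_name}: {credits:.1f} credits")
    conn.close()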

Reimagine Batch and Streaming Data Pipelines with Dynamic Tables, Now Generally Available

Since Snowflake’s Dynamic Tables went into preview, we have worked with hundreds of customers to understand the challenges they faced producing high-quality data quickly and at scale. The No. 1 pain point: Data pipelines are becoming increasingly complex. This rising complexity is a result of myriad factors.
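
Dynamic tables address that complexity declaratively. A minimal sketch of the pattern, assuming the snowflake-connector-python package and placeholder table, warehouse, and credential names:

    import snowflake.connector

    # Placeholder connection details, as in the previous sketch.
    conn = snowflake.connector.connect(
        account="myorg-myaccount",
        user="DATA_ENGINEER",
        password="***",
    )

    # A dynamic table is declared once as a query; Snowflake keeps the
    # result refreshed within the stated TARGET_LAG, replacing hand-built
    # streams-and-tasks plumbing for incremental pipelines.
    conn.cursor().execute("""
        CREATE OR REPLACE DYNAMIC TABLE daily_revenue
            TARGET_LAG = '5 minutes'
            WAREHOUSE = transform_wh
        AS
        SELECT order_date,
               SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY order_date
    """)
    conn.close()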

Data Prep for AI: Get Your Oracle House in Order

Despite the transformative potential of AI, a large number of finance teams are hesitating, waiting for this emerging technology to mature before investing. According to a recent Gartner report, a staggering 61% of finance organizations haven’t yet adopted AI. Finance has always been considered risk averse, so it is perhaps unsurprising to see that AI adoption in finance significantly lags other departments.

Data Accessibility: A Hurdle Before SAP's AI Integration

Unlocking the power of AI within SAP for your team requires overcoming a significant hurdle: data accessibility. SAP data’s complexity, spread across various modules, creates silos of information that your team might struggle to understand and utilize effectively. Inaccessible or misaligned SAP data will hinder your AI system’s ability to learn and deliver valuable results specific to your organization.

The engineering behind autoscaling with HashiCorp's Nomad on a global serverless platform

There are several ways to handle load spikes on a service, the most common being to provision for peak load up front or to add instances by hand. However, these methods are not cost-effective: you either pay for resources you don't use, or you risk not having enough resources to handle the load. Fortunately, there is a third way: horizontal autoscaling, the process of dynamically adjusting the number of instances of a service based on the current load. This way, you only pay for the resources you use, and you can handle load spikes without any manual intervention.
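
The decision at the heart of horizontal autoscaling fits in a few lines. A toy Python sketch of the target-capacity heuristic (the function name, rates, and bounds are illustrative; the Nomad Autoscaler wraps this kind of rule in policies, plugins, and cooldowns):

    import math

    def desired_instance_count(current_load_rps: float,
                               capacity_per_instance_rps: float,
                               min_instances: int = 1,
                               max_instances: int = 20) -> int:
        # Scale out to cover the load, but stay within the policy's bounds.
        raw = math.ceil(current_load_rps / capacity_per_instance_rps)
        return max(min_instances, min(max_instances, raw))

    # Example: 4,200 requests/s against instances that each handle 500 rps
    print(desired_instance_count(4200, 500))  # -> 9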

Federated Connectivity: Unlocking Data Silos with API Gateways

"The whole is more than the sum of its parts." Aristotle is credited with this quote, and it's true in the world of data. Legacy systems typically approached their role in a limited manner. Each system was intended to be used by a certain user set and handle well-defined processes and associated data. The result was a disintegrated environment with data being difficult to obtain, and frequently out of date. The parts couldn't easily cooperate to make a whole.