
Maximize Business Results with FinOps

As organizations run more data applications and pipelines in the cloud, they look for ways to avoid the hidden costs of cloud adoption and migration. Teams seek to maximize business results through cost visibility, forecast accuracy, and financial predictability. Watch the breakout session video from Data Teams Summit and see how organizations apply agile and lean principles using the FinOps framework to boost efficiency, productivity, and innovation. Transcript available below.

Enabling Strong Engineering Practices at Maersk

As DataOps moves along the maturity curve, many organizations are working out how best to balance the reliable execution of critical jobs with time and cost governance. Watch the fireside chat from Data Teams Summit where Mark Sear, Head of Data Platform Optimization at Maersk, shares how his team is enabling strong engineering practices, design tenets, and culture at one of the largest shipping and logistics companies in the world.

Getting Up to Speed on Snowpark for Python with Educational Services

In today's livestream, Evan Troyka and Melanie Klein will introduce the one-day Snowpark DataFrame Programming course, which covers the concepts, features, and programming constructs practitioners need to build DataFrame-based data solutions in Snowflake.

Learn How Snowflake's Own IT Department Built a Solution to Optimize Software Licenses

In this episode of “Data Cloud Now,” host Ryan Green sits down with Snowflake CIO and CDO Sunny Bedi to discuss how, in today’s uncertain economy, organizations use data and machine learning to maximize operational efficiency and control costs. Sunny and Ryan discuss how to use data and applications to optimize provisioning employees with the right technical resources and software. Snowflake itself has recently developed SnowPatrol — an internal application to analyze, predict, and optimize software spend.

Spark Technical Debt Deep Dive

Once in a while I stumble upon Spark code that looks as if it was written by a Java developer, and it never fails to make me wince, because it is a missed opportunity to write elegant and efficient code: the result is verbose, difficult to read, and full of distributed processing anti-patterns. One such occurrence happened a few weeks ago, when one of my colleagues was trying to get some churn analysis code downloaded from GitHub to work.

Revamping Data Management Strategies with Data Pipelines

1. Data pipelines can improve data management strategies by enabling quick and easy data flow, transformation, and analysis.
2. Considerations when building a data pipeline include real-time data ingestion, scalability, performance optimization, data security and governance, and support for multiple sources.
3. Data mesh is a decentralized data architecture that organizes data by business domain; teams adopting it must follow the architecture's principles.

How Retailers Modernize Operations and Reporting with the Snowflake Retail Data Cloud

A modern data infrastructure is essential for retailers looking to stay competitive today. Companies are abandoning more traditional, on-premises IT infrastructures and moving to more centralized “as a service” (XaaS) models of delivery enabled by cloud technologies, according to McKinsey. Aging on-premises infrastructures are unable to meet demands for agility and innovation, eating up too much time and too many resources for teams trying to maintain them.

How Banks are Using Technologies to Help Underserved Communities

Financial inclusion, defined as the availability and accessibility of financial services to underserved communities, is a critical issue facing the banking industry today. According to the World Bank, 1.7 billion adults around the world do not have access to formal financial services, meaning that they cannot open a bank account or access credit, insurance, or other financial products.