
Bridging The Gap Between Legacy Security And Modern Threat Detection

In this episode of “Powered by Snowflake,” host Daniel Myers sits down with Anvilogic’s Security Strategist and Head of Product Marketing, Jade Catalano. Founded by a team of security industry veterans, Anvilogic helps ease legacy security systems into the future, enabling organizations to quickly detect, hunt, and respond to threats. This conversation covers the challenges of modernizing legacy security systems, how to manage an excess of data, a product demo, and more.

Health Check Command in Docker

Development is more reliable and streamlined with Docker. With Docker, developers can build fast, easy, and portable applications - both on desktops and in the cloud - by eliminating redundant, mundane configuration tasks. From UIs and CLIs to APIs and security, Docker's integrated end-to-end platform is built to work across the entire application delivery pipeline. In Docker containers, the HEALTHCHECK instruction determines the container's health status by periodically running a command inside it.
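As a minimal sketch of the idea, the Dockerfile below adds a HEALTHCHECK to an nginx image; the specific probe command, endpoint, and timing values are illustrative assumptions, not a prescribed configuration:

```dockerfile
FROM nginx:alpine

# Probe the web server every 30s; allow 5s per attempt, a 10s warm-up
# before failures count, and mark the container unhealthy after 3
# consecutive failures. busybox wget ships with the alpine base image.
HEALTHCHECK --interval=30s --timeout=5s --start-period=10s --retries=3 \
  CMD wget -q --spider http://localhost/ || exit 1
```

Once the container is running, `docker ps` reports the result in the STATUS column (e.g. "healthy"), and `docker inspect --format '{{.State.Health.Status}}' <container>` prints it directly.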

What is Data Security? - The Role of Analytics in Data Protection

Data security (or data protection) is a term often used in the context of analytics and business intelligence (BI). It encompasses a number of different policies, processes, and technologies that protect a company's cyber assets against data breaches and threats. But what does all of that really mean, in relation to BI specifically?

What's Happening To Middleware In The Cloud-Native Era?

Spending two decades in the middleware field has given me deep insight into the evolution of this technology domain. I began my career as a software engineer in a platform group, building reusable components using technologies like Object Linking and Embedding (OLE), the Distributed Component Object Model (DCOM), and the Common Object Request Broker Architecture (CORBA).

Episode 3 & 4 | Data Destination & Data Governance | Data Journey

What are data destinations? In the abstract, a data destination is simply another element in the series of processing stages that make up a data pipeline. When an element is called out as the destination, however, it usually means the final stop, such as a database, data lake, or data warehouse. And yet any element within the data pipeline has aspects of a final destination (and the scaling challenges that come with one).

Managing Kuma Tokens with HashiCorp Vault

Managing tokens in Kuma can be a challenging and manual process. Both user and dataplane token lifetimes need to be manually tracked and managed. This ultimately becomes a burden for DevOps, and long-lived tokens end up being used. In this session, you will learn how to manage Kuma tokens with HashiCorp Vault. Kong Builders is a livestream series that takes our developer-focused toolsets and puts them on display in the best venue possible – building applications and connecting workloads.