
Data Science vs. Data Engineering: What You Need to Know

According to The Economist, “the world’s most valuable resource is no longer oil, but data.” Despite the value of enterprise data, much has been written about the so-called “data science shortage”: the supposed lack of professionals with knowledge of how to use and manipulate big data. A 2018 study by LinkedIn estimated that there were more than 151,000 unfilled jobs in the U.S. requiring data science skills.

What is Low Code?

Businesses increasingly demand new software solutions that are quick, efficient, and user-friendly. Low code automates several steps of the application development process while still enabling rapid delivery. In the simplest terms, low code is a way of building processes and applications with very little hand-written code. There are several aspects of this type of software development you need to understand to fully answer the question: what is low code?

How to Build Real-Time Feature Engineering with a Feature Store

Simplifying feature engineering for building real-time ML pipelines might just be the next holy grail of data science. It’s incredibly difficult and highly complex, but it’s also desperately needed for multiple use cases across dozens of industries. Currently, feature engineering is siloed between data scientists, who search for and create the features, and data engineers, who rewrite the code for a production environment.
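The split described above — define a feature once, serve it both offline and in real time — can be sketched with a minimal in-memory example. The `FeatureStore` class and the `avg_order_value` feature here are hypothetical, for illustration only; real feature stores add versioning, offline/online sync, and low-latency storage.

```python
class FeatureStore:
    """Toy in-memory feature store: a single feature definition serves
    both offline training and online (real-time) retrieval."""

    def __init__(self):
        self._features = {}  # feature name -> transformation function
        self._online = {}    # (feature name, entity id) -> latest value

    def register(self, name, fn):
        # Data scientists define the feature transformation once...
        self._features[name] = fn

    def materialize(self, name, entity_id, raw):
        # ...and the same logic populates the online store, instead of
        # being rewritten by data engineers for production.
        value = self._features[name](raw)
        self._online[(name, entity_id)] = value
        return value

    def get_online(self, name, entity_id):
        # Low-latency lookup used by a real-time ML pipeline at inference time.
        return self._online[(name, entity_id)]


store = FeatureStore()
store.register("avg_order_value", lambda orders: sum(orders) / len(orders))
store.materialize("avg_order_value", "user-42", [10.0, 30.0])
print(store.get_online("avg_order_value", "user-42"))  # 20.0
```

The point of the pattern is that `register` is called once, by the person who designs the feature, while `materialize` and `get_online` are driven by production pipelines — removing the hand-off in which feature code is rewritten for deployment.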

Enabling The Full ML Lifecycle For Scaling AI Use Cases

When it comes to machine learning (ML) in the enterprise, there are many misconceptions about what it actually takes to effectively employ machine learning models and scale AI use cases. When many businesses start their journey into ML and AI, it’s common to place a lot of energy and focus on the coding and data science algorithms themselves.

Spark APM - What is Spark Application Performance Management

Apache Spark is a fast and general-purpose engine for large-scale data processing. It’s most widely used to replace MapReduce for fast processing of data stored in Hadoop. Designed specifically for data science, Spark has evolved to support more use cases, including real-time stream event processing. Spark is also widely used in AI and machine learning applications.
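The MapReduce-style computation Spark is typically used to replace can be illustrated with a plain-Python word count: a map phase that emits words from each line, and a reduce phase that sums counts per word. In Spark the same pipeline is commonly written as `flatMap` followed by `reduceByKey`; this standalone sketch uses only the standard library.

```python
from collections import Counter
from itertools import chain


def word_count(lines):
    """Map phase: split each line into words (Spark's flatMap);
    reduce phase: sum the counts per word (Spark's reduceByKey)."""
    words = chain.from_iterable(line.split() for line in lines)
    return dict(Counter(words))


docs = ["spark replaces mapreduce", "spark streams events"]
print(word_count(docs))
# {'spark': 2, 'replaces': 1, 'mapreduce': 1, 'streams': 1, 'events': 1}
```

Spark's advantage over classic MapReduce is that intermediate results like the word stream above can be kept in memory across stages rather than written to disk between each map and reduce step.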

How DataRobot Automates Artificial Intelligence | Part 1 | Snowflake Inc.

Dan Wright, President and COO of DataRobot, talks about how DataRobot is revolutionizing artificial intelligence by automating the end-to-end experience of bringing AI to market. He discusses how DataRobot's new platforms unlock powerful applications of deep learning, helping customers leverage open source technology while keeping their data secure. Rise of the Data Cloud is brought to you by Snowflake.

Decentralizing API Design at NAB | National Australia Bank

Continuing on the journey of building a new API platform, NAB’s primary focus this year has been improving API quality by building tooling for API design and API governance. This Kong Summit 2020 session will cover NAB’s new API search capability, which allows quick API discovery across the organization, and how the team implemented decentralized API governance.