In the big data sector, there is a clear split between legacy and next-generation approaches to software development. Legacy vendors in this space generally maintain large internal development organizations dedicated to building proprietary, bespoke software. It’s an approach that has worked well over the years.
Today, almost everyone has big data, machine learning and cloud at the top of their IT “to-do” list. The importance of these technologies can’t be overemphasized, as all three are opening up innovation, uncovering opportunities and optimizing businesses. Machine learning isn’t a brand-new concept; simple machine learning algorithms actually date back to the 1950s, though today it is applied to large-scale data sets and applications.
Since the 1960s, scientists have been forecasting the weather using satellite-captured imagery. Although access to these satellite feeds was once reserved for meteorologists, these days anyone can jump online to find current satellite footage for their area. But what if you wanted to take things a step further? Maybe you’re curious about the history of weather events, or want to create a real-time feed for where you live.
If you’ve spent any time reading the round-ups of 2018 technology predictions, you’ve likely seen Artificial Intelligence (AI) highlighted in nearly every one. That’s because AI has a seemingly limitless number of applications and use cases for the enterprise. In fact, according to Gartner, over 85% of customer interactions will be managed without a human by 2020.
If you’re familiar with my previous Talend blogs, perhaps you’ve noticed that I like to talk about building better solutions through design patterns and best practices. My blogs also tend to be a bit long. Yet you read them; many thanks for that! This blog focuses on methodologies as they apply to Talend solutions.