Latest Blogs

In AI we Trust? Why we Need to Talk about Ethics and Governance (part 1 of 2)

Advances in the performance and capability of Artificial Intelligence (AI) algorithms have led to a significant increase in adoption in recent years. A February 2021 report by IDC estimates that worldwide revenues from AI will grow by 16.4% in 2021 to USD $327 billion. Furthermore, AI adoption is becoming increasingly widespread rather than being concentrated within a small number of organisations.

Enabling distributed NLP research at SIL

In my main position as a data scientist at SIL International, I work on expanding language possibilities with AI. In practice, this means applying recent advances in Natural Language Processing (NLP) to low-resource and multilingual contexts. We work on things like spoken language identification, multilingual dialogue systems, machine translation, and translation quality estimation.

Push to production pipelines and JMeter

This post does not look at a particular aspect of JMeter, nor does it give a detailed overview of a particular tool that complements performance testing with JMeter. Instead, it is about the principles of push to production pipelines and performance testing. While the post is not specifically about JMeter, in my experience JMeter is one of the best performance testing tools for this type of pipeline integration.
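To make the pipeline idea concrete, here is a minimal sketch of a scripted pipeline step that runs a JMeter test plan in non-GUI mode and fails the build if JMeter exits with an error. The file names (load-test.jmx, results.jtl) are placeholders rather than anything from the post, and the sketch assumes JMeter's standard -n/-t/-l command-line flags and a JMeter install on the PATH.

```typescript
// run-load-test.ts — a hypothetical pipeline step that runs JMeter headlessly.
// Assumes JMeter is on the PATH and load-test.jmx exists in the repository.
import { spawnSync } from "node:child_process";

const result = spawnSync(
  "jmeter",
  [
    "-n",                  // non-GUI mode, required for CI pipelines
    "-t", "load-test.jmx", // the test plan to execute (placeholder name)
    "-l", "results.jtl",   // where to write the sample results
  ],
  { stdio: "inherit" }
);

// Propagate JMeter's exit code so the pipeline stage fails on error.
process.exit(result.status ?? 1);
```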

Performance Testing Web Sockets with JMeter

In this post we are going to look at WebSockets, specifically how JMeter can be used to test them. WebSockets are not supported natively by JMeter, but there are a couple of plugins that work very nicely. One of them is called JMeter WebSocket Sampler, by Maciej Zaleski, and information on that library can be found here. The second, and the one we will use for this post, is also called JMeter WebSocket Sampler and is by Peter Doornbosch; more information on this plugin can be found here.
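For readers who want something local to point the sampler at, here is a minimal sketch of a WebSocket echo server built with the ws npm package. It is not part of the post, just a hypothetical test target, and the port number is arbitrary.

```typescript
// echo-server.ts — a hypothetical local target for JMeter WebSocket tests.
// Requires the "ws" package (npm install ws).
import { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket) => {
  // Echo every message back, so a JMeter sampler can assert on the response.
  socket.on("message", (data) => {
    socket.send(data.toString());
  });
});

console.log("WebSocket echo server listening on ws://localhost:8080");
```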

7 Ways to Improve Node.js Performance at Scale

Performance is one of the most important aspects of web application development. A fast application will make its users, developers, and business stakeholders happy, while a slow one is sure to frustrate all three parties. In this article, we will consider some practices you should adopt to scale your Node.js servers so they can handle high-traffic workloads without a degraded user experience.
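As one illustration of the kind of practice the article covers, here is a minimal sketch of clustering with Node's built-in cluster module, which forks one worker per CPU core so a single process is not the bottleneck. Clustering is a common scaling technique, not necessarily one of the article's seven, and the port number is arbitrary.

```typescript
// cluster-server.ts — a sketch of scaling a Node.js HTTP server across CPU cores.
import cluster from "node:cluster";
import http from "node:http";
import os from "node:os";

if (cluster.isPrimary) {
  // The primary process only forks workers; it serves no traffic itself.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  // Replace any worker that dies so capacity stays constant.
  cluster.on("exit", () => cluster.fork());
} else {
  // Each worker listens on the same port; connections are distributed across them.
  http
    .createServer((_req, res) => {
      res.end(`handled by worker ${process.pid}\n`);
    })
    .listen(3000);
}
```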

Save your engineers' sleep: best practices for on-call processes

Many technology companies have an ongoing commitment to their customers to guarantee reliability and uptime, with service level agreements that commit them to resolving or escalating incidents within a particular time frame. Engineering team members rotate shifts so that someone is always on-call to be "paged" (these days, not using an actual pager) if an issue arises. Being on-call means that you typically need to be responsive even outside of your usual office hours.