
Big Data Performance Testing: A Look at the Bigger Picture

Written on 18 Jul 2017
Overview

Every growing business looks forward to the day when its ability to process, test, and pull important insights out of mass quantities of data catches up with the huge amount of data it collects on the web. The challenge that big data imposes might seem far too complicated and overwhelming to meet; however, with the right strategy and tools at your disposal, you can manage the process relatively smoothly.

‘Big Data’ is currently one of the top buzzwords in the information and marketing space, even one of the top trends for information technology in general. Not very long after the World Wide Web started seeing commercial success, we started milking as much data out of it as possible, hoping that in the future one could more easily process and make sense of it all.
Enter that buzzword, ‘Big Data’, which refers to the relatively new and quickly growing industry centered around managing and processing this data. It seems everyone has heard of it to the point of exhaustion, and everyone is racing to keep their business at the forefront of Big Data technology, even if they don’t understand it and are just blindly throwing money at consultants who do.
An ever-increasing number of businesses are looking to streamline the collection and processing of this data, hoping it will offer great opportunities for growth. To take advantage of the potential that big data has to offer, one must have a strategy for setting up the complex infrastructure required.
These days, the opportunities and growth challenges that come with data process engineering are three-fold: the sheer amount of data available (volume), the speed at which data moves in and out of the system (velocity), and the range of data types and sources (variety). Together these make up the 3V model: volume, velocity, and variety.
Big data performance testing is concerned with how well the system performs at churning out data that is useful to the business, not just with managing the integrity and complexity of the data itself. Much of your investment should go into framework performance engineering, failover, and data rendition.

Strategies Necessary for Performance Testing

It is important to prioritize architectural testing. Systems that are inadequate for the volume or type of data coming in, or that are simply poorly designed, have a high probability of delivering inadequate performance or degrading over time. Below are four key areas to focus on when performance testing Big Data systems.
  1. Data ingestion – The process by which data is ‘absorbed’ or ingested into the larger system. This includes both data meant for immediate use and data meant to be archived or ‘warehoused’.
    The focus here should be on routing files or entries to the right destination in a timely manner. The process of ingesting and sorting data should be able to exceed the rate at which data is collected, with consideration for peak times. This part of the process is about ingestion and sorting, not validation.
  2. Data processing – Data gathered from many sources will need to be deduplicated, aggregated, and often de-anonymized, depending on how it will be used. In Big Data driven targeted marketing, even without uniquely identifying information, sets of data from different sources can be connected to individual prospects with a high degree of certainty using just a few complementary data points, allowing a profile to be built from all the sources the data originally came from.
    This process of data mapping varies heavily depending on the framework and overall methodology of the operation. In a way, this is a large part of the ‘secret sauce’ of any Big Data operation, and strategies for this step are where businesses hope to beat the competition (a minimal sketch of this kind of record linkage follows this list).
    The data is often processed in batches, and the system needs to offer reliability and scalability; after all, the amount of data we collect is ever increasing, and it’s not as if you’re going to throw away most of your old data. The systems here may require ‘unique’ infrastructure for the complex operations that come later, as there is often a lot of parallel processing involved. GPU-based servers, originally a niche used for things like video rendering and scientific research, are now used for Machine Learning and AI as well (a major driver of Big Data). Infrastructure for this type of computing is often more expensive: while the rate of improvement for CPUs has slowed drastically, GPU technology still tracks much closer to Moore’s law, which means the service lifetime of a GPU cluster is considerably shorter before it is no longer worth the space and power cost of running it.
  3. Data persistence – The next area to focus on is the way the data is structured and archived. There are many options for data storage: ‘data marts’, relational database management systems, data warehousing, and more. The key here is consistency and adaptability.
    You’ll essentially be throwing data onto a pile, and it will keep growing. Finding a scalable solution that’s cost effective while maintaining whatever speed of access you need is important. What if you need something from the middle of that pile, and you need it right now? It had better not be on tape storage or Amazon Glacier! Often, Big Data solutions involve a tiered system, ranging from ‘hot’ to ‘cold’ depending on how often and how quickly the data needs to be accessed. After all, storage that can be read at high speed or handle a large number of random operations is much more expensive than slower, denser, and less flexible archival solutions.
  4. Reporting and analytics – This is the process of pulling insights from the processed data, the other part of the ‘secret sauce’ of Big Data. After the data has been processed, methods are applied to uncover hidden patterns, correlations, emerging market trends, or any other information useful to the business undertaking the Big Data operation. The focus is on applying the right algorithms, often involving the latest in two other buzzwords: Machine Learning and AI. This is where the parallel computing infrastructure mentioned earlier matters; reporting and analytics is more about the methods applied to the data, so the data processing infrastructure must be adequate and ready for rapid changes in reporting and analytics methodology.
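
To make the data processing step above more concrete, here is a minimal sketch of linking records from two sources using a few complementary data points (a normalized email address and phone number). The source names, field names, and matching rule are illustrative assumptions, not a prescription; real pipelines use far more sophisticated probabilistic matching.

```python
# Minimal record-linkage sketch: merge records from two hypothetical
# sources into one profile when a complementary field matches.
# All field names and the matching rule are illustrative assumptions.
from collections import defaultdict

def normalize_email(email):
    return email.strip().lower() if email else None

def normalize_phone(phone):
    # Keep digits only, e.g. "+1 (555) 010-1234" -> "15550101234"
    return "".join(ch for ch in phone if ch.isdigit()) if phone else None

def link_records(*sources):
    """Group records that share a normalized email or phone number."""
    profiles = defaultdict(dict)   # profile key -> merged profile
    index = {}                     # (field, value) -> profile key

    def merge(record):
        keys = [("email", normalize_email(record.get("email"))),
                ("phone", normalize_phone(record.get("phone")))]
        keys = [k for k in keys if k[1]]
        # Reuse an existing profile if any complementary key already matches.
        profile_key = next((index[k] for k in keys if k in index), len(profiles))
        profiles[profile_key].update({f: v for f, v in record.items() if v})
        for k in keys:
            index[k] = profile_key

    for source in sources:
        for record in source:
            merge(record)
    return list(profiles.values())

if __name__ == "__main__":
    crm = [{"email": "Jane@Example.com", "name": "Jane Doe"}]
    web = [{"email": "jane@example.com", "phone": "+1 555 010 1234",
            "last_page": "/pricing"}]
    print(link_records(crm, web))
    # -> one merged profile with name, email, phone and last_page
```

A production pipeline would also have to resolve conflicting values and transitive matches (two records linked only through a third); the point here is simply that a handful of complementary fields is enough to stitch a single profile together.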

How to Approach Performance Testing

Because the data is highly complex (large volumes of both structured and unstructured data), one should keep the following in mind when testing:
  1. Data insertion rate – The rate at which data can be ingested into the system, and the rate at which it is moved, mostly between different parts of the overall system
  2. Query performance – The processing speed of queries and data retrieval from the system (a minimal timing sketch for both metrics follows this list)
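
Here is a minimal sketch of measuring both metrics against a generic datastore client. The `store.insert` and `store.query` calls are placeholders for whatever client API your system actually exposes and are assumptions for illustration only.

```python
# Minimal sketch: measure bulk-insert throughput and query latency.
# `store.insert` / `store.query` are stand-ins for your real client API.
import time
import statistics

def measure_insert_rate(store, records):
    """Return sustained inserts per second for a batch of records."""
    start = time.perf_counter()
    for record in records:
        store.insert(record)
    elapsed = time.perf_counter() - start
    return len(records) / elapsed if elapsed > 0 else float("inf")

def measure_query_latency(store, queries):
    """Return median and 95th-percentile query latency in milliseconds."""
    latencies = []
    for q in queries:
        start = time.perf_counter()
        store.query(q)
        latencies.append((time.perf_counter() - start) * 1000)
    latencies.sort()
    p95 = latencies[max(int(len(latencies) * 0.95) - 1, 0)]
    return statistics.median(latencies), p95

# Usage, assuming some client object `store` with insert/query methods:
#   rate = measure_insert_rate(store, sample_records)
#   median_ms, p95_ms = measure_query_latency(store, sample_queries)
#   print(f"{rate:.0f} inserts/sec, median {median_ms:.1f} ms, p95 {p95_ms:.1f} ms")
```

Run the same measurements at several concurrency levels and data volumes, including expected peak load, so you can see where throughput flattens out or latency starts to climb.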
Since the system is made up of multiple components, it is important to test in isolation, starting at the component level before testing everything together. Performance testers must be well-versed in the frameworks and technology of Big Data. This will involve applying market tools, including:
  1. Yahoo Cloud Serving Benchmark (YCSB) – An open-source benchmark client for cloud serving data stores that issues read, write, and update operations according to configurable workloads (a usage sketch follows this list)
  2. Sandstorm – An automated performance testing tool with support for big data performance testing
  3. Apache JMeter – A tool that provides the necessary plug-ins for testing databases
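
As an example of how a tool like YCSB fits into an automated test run, here is a sketch that shells out to its load and run phases and pulls the overall throughput figure from the report. The launcher path, the mongodb binding, the bundled workloada file, and the report format are typical of YCSB releases but should be treated as assumptions; check the documentation for your version and target datastore.

```python
# Sketch: drive a YCSB load/run cycle and extract overall throughput.
# Launcher path, binding and workload below are illustrative assumptions.
import subprocess

YCSB = "./bin/ycsb"                 # assumed path to the YCSB launcher script
BINDING = "mongodb"                 # assumed datastore binding
WORKLOAD = "workloads/workloada"    # one of the workloads bundled with YCSB

def run_phase(phase):
    """Run a YCSB phase ('load' or 'run') and return its report output."""
    result = subprocess.run(
        [YCSB, phase, BINDING, "-s", "-P", WORKLOAD],
        capture_output=True, text=True, check=True)
    return result.stdout

def overall_throughput(report):
    """Look for an '[OVERALL], Throughput(ops/sec), N' line in the report."""
    for line in report.splitlines():
        if line.startswith("[OVERALL]") and "Throughput" in line:
            return float(line.split(",")[-1])
    return None

if __name__ == "__main__":
    run_phase("load")                                   # populate the store
    print("ops/sec:", overall_throughput(run_phase("run")))
```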
It may seem like big data performance testing is challenging, but having the right tools, strategies, and expertise at your disposal will allow you to manage everything smoothly! If you’re venturing into a project that takes advantage of Big Data tools and you are looking for some outside expertise, CodeClouds has taken on this type of project before, and we can help you get on the right track.
