Big Data analytics: what’s old is new again

Guest Post by Kevin McConnell, Analytics Solutions & SI Alliances Global Leader at HP Software, Analytic Industry Solutions at Vertica Systems, and Jeff Healey, Director of Product Marketing, HP Vertica

 

Many consider Big Data analytics to be a new paradigm. In reality, analyzing massive amounts of data has been in practice for years, particularly in the financial services, communications, and manufacturing industries. Interestingly, one of the early pioneers was UPS, which used analytics in the 1950s to improve operations.

 

At the turn of the century, the Internet kicked off the second wave of Big Data analytics. Communications companies needed to better understand network traffic in order to plan for growth and manage capacity. The biggest consumer of analytics in this era, however, was marketing: consumer-focused organizations needed to leverage every behavioral nugget they could gather about their customers.

 

The explosion of intelligent mobile devices and the rise of online social networks have taken data volumes to unprecedented levels. To keep up, organizations have resorted to sampling their data and limiting the historical depth of their data sets. Many companies discovered that they were losing revenue because of the inaccuracy these shortcuts introduced. They needed ways to include as much data as possible, incorporate web and mobile interactions in real time, and deliver analytic results fast enough to create offers at the time of engagement.

 

The Evolution of Big Data Analytics


Technology has evolved to address this issue. The analytics market has matured, creating far more choices than existed a decade ago. SAS still holds the predominant share but sees competitors encroaching on the market it created. Customers have deployed many database options to improve the performance of these analytics. Those options did a great job of improving query times, but they did not improve the overall performance of the end-to-end process.

 

Leading firms today have realized that a large portion of the work required is in the preparation of data. There are many tools to help with data preparation, but breaking the process up across systems increases both the expense and the transfer time. The secret to Big Data performance is doing the preparation in the database. A solid database provides the flexibility to create data sets optimized for the required analytics as the requests are executed. Some refer to this as ELT (Extract, Load, & Transform) rather than ETL (Extract, Transform, & Load). ELT reduces the preparation time as well as the analytic processing time. Applying analytics to data sets optimized for the task at hand lets you focus processing on only the relevant data, and therefore on more of that relevant data within the same time and cost budget.
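To make the ELT idea concrete, here is a minimal sketch. It uses SQLite purely as a stand-in for an analytic database, and the table, column, and view names are hypothetical: the raw interaction data is loaded as-is, and the transformation into an analysis-ready data set happens inside the database, scoped to just what the analytic task needs.

```python
# Minimal ELT sketch: load raw data first, transform inside the database.
# sqlite3 is used only as an illustrative stand-in; all names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract & Load: land the raw interaction events without reshaping them first.
cur.execute("CREATE TABLE raw_events (user_id TEXT, channel TEXT, amount REAL, ts TEXT)")
cur.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?, ?)",
    [
        ("u1", "web",    19.99, "2012-06-01"),
        ("u1", "mobile",  5.00, "2012-06-02"),
        ("u2", "web",    42.50, "2012-06-02"),
    ],
)

# Transform: build the analysis-ready data set in the database at request time,
# keeping only the rows and columns the analytic task actually requires.
cur.execute("""
    CREATE VIEW spend_by_user AS
    SELECT user_id, channel, SUM(amount) AS total_spend, COUNT(*) AS events
    FROM raw_events
    GROUP BY user_id, channel
""")

for row in cur.execute("SELECT * FROM spend_by_user ORDER BY total_spend DESC"):
    print(row)
```

The design point is that the transformation step never leaves the database, so there is no extra hop to a separate preparation tool and the analytic query runs against a data set already shaped for it.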

 

Key clients have realized a competitive advantage by implementing this process, achieving greater accuracy and faster results while simultaneously saving millions in operational expenses.

 

To find out how you can take advantage of Big Data Analytics, visit our homepage here.
