We live in a world of data. This data is generated by transactions, feedback, and real-time interaction with customers, partners, suppliers, and employees.
Here are the 5 V’s of big data:
- Volume: This refers to the sheer amount of data generated every moment. Think of all the emails, Twitter messages, photos, video clips and sensor data generated and shared every second. Data is no longer measured in mere terabytes; the world generates zettabytes, with brontobytes on the horizon. On Facebook alone, people send 10 billion messages, click the Like button 4.5 billion times and upload 350 million pictures every single day. The amount of data generated from the beginning of time up to the year 2000 is now generated every minute. Such data sets are too voluminous to store and analyze with legacy database systems. Big data technology lets us store and analyze them on distributed systems, where parts of the data are stored in different places, connected by networks and brought together by big data software.
- Velocity: This refers to the speed at which data is generated and the frequency at which it moves around. Think of social media messages going viral in minutes, the rate at which credit card transactions are checked for fraudulent activity, or the milliseconds trading systems take to analyze social media networks for signals that trigger decisions to buy or sell shares. Big data tools let us analyze data while it is being generated, without first loading it into a database system.
- Variety: This refers to the different types of data we can now use. In the past the major focus was on structured data that fits neatly into tables or relational databases, such as financial data (for example, sales by product or region). An estimated 80 percent of the world's data is unstructured and therefore cannot easily be put into tables or relational databases, for example photos, video sequences or social media updates. With big data tools we can now harness many types of data, such as messages, social media conversations, photos, sensor data, video and voice recordings, and analyze them together with more traditional, structured data.
- Veracity: This refers to the messiness or trustworthiness of the data. With many forms and types of big data, quality and accuracy are less controllable; think of Twitter posts full of hashtags, typos, abbreviations and colloquial speech. Big data and analytics tools let us work with these kinds of data. Volume often comes at the expense of quality and accuracy, but this entire volume of fast-moving data of varying variety and veracity still has to be turned into value, which is why value is the one V of big data that matters most.
- Value: This refers to our ability to turn data into value. Businesses must make a clear case for any attempt to collect and leverage big data; it is easy to fall into the buzz trap and embark on big data initiatives without a clear understanding of the business value they will bring.
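The velocity point above, analyzing data as it arrives instead of loading it into a database first, can be sketched in plain Python (not any particular big data product). This hypothetical fraud check flags a transaction that is far above the recent moving average of a stream; the window size and threshold are illustrative assumptions, not values from any real system.

```python
from collections import deque

def stream_fraud_check(transactions, window=5, threshold=3.0):
    """Flag transactions far above the recent moving average, processing
    each event as it arrives rather than after a bulk database load."""
    recent = deque(maxlen=window)  # sliding window of recent amounts
    flagged = []
    for amount in transactions:
        if len(recent) == window:
            avg = sum(recent) / window
            if amount > threshold * avg:  # suspiciously large vs. recent history
                flagged.append(amount)
        recent.append(amount)
    return flagged

# A spike of 500 against a baseline of ~10 is flagged the moment it arrives.
print(stream_fraud_check([10, 12, 9, 11, 10, 500, 10]))  # → [500]
```

Real streaming platforms apply the same idea at scale, keeping only a bounded window of state per key so throughput stays constant as the stream grows.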
There are three reasons why we are generating data faster than ever:
- Processes are increasingly automated
- Systems are increasingly interconnected
- People are social and continuously generate data exhaust by interacting online
Data, in general, falls into three categories:
- Business application data (e.g., SAP or Oracle ERP)
- Human-generated data (e.g., social media)
- Machine data (e.g., RFID, log files)
On top of business application data come click and mobile-app transactions, plus the explosive growth of human-generated data: blogs, reviews, messages, emails and pictures. Twitter alone generates more than 7 terabytes a day, tens of millions of tweets, and is growing rapidly; Facebook is estimated to generate more than 10 terabytes a day. Social graphs power features such as product recommendations based on your circle of friends, jobs you may like (LinkedIn), the products you have looked at, and the people in your network.
Big data analytics is used in almost every domain. Here are a few examples:
- Predictive maintenance using sensors: High-end cars use telemetry to detect that an engine part is likely to break down before it actually does, based on vibration or temperature patterns, a technique known as predictive maintenance. The idea is that a part does not fail all at once; it deteriorates over time until it eventually breaks. By monitoring the part in real time, you can spot problems before they become obvious.
- Energy management: Many firms use big data for energy management, including energy optimization, smart-grid management, building automation and energy distribution at utility companies. The use case centers on monitoring and controlling network devices, managing service outages, and dispatching crews. It gives utilities the ability to integrate millions of data points on network performance and lets engineers use analytics to monitor the network.
- E-tailing (e-commerce / online retailing): E-tailers like Amazon.com are constantly creating targeted offers to boost customer lifetime value (CLV); delivering consistent cross-channel customer experiences; harvesting customer leads from sales, marketing, and other sources; and continuously optimizing back-end process orchestration. Typical use cases include:
- Recommendation engines: increase average order size by recommending complementary products based on predictive analysis for cross-selling.
- Cross-channel analytics: sales attribution, average order value, lifetime value (e.g., how many in-store purchases resulted from a particular recommendation, advertisement or promotion).
- Event analytics: what series of steps led to a desired outcome (e.g., purchase, registration).
- Next best offer: deploying predictive models in combination with recommendation engines to drive the right offer at the right time, with tailored interactions across multiple channels.
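The predictive maintenance idea described above can be sketched in a few lines, under a simplifying assumption: a part's vibration level drifts upward roughly linearly as it wears, so fitting a least-squares trend to daily readings lets us estimate when it will cross a failure threshold. The readings and threshold below are invented for illustration; real systems use far richer models and many sensors.

```python
def days_until_threshold(readings, threshold):
    """Estimate when a slowly degrading part will cross its failure
    threshold by fitting a least-squares line to past sensor readings.
    readings: one vibration measurement per day, oldest first."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings)) \
            / sum((x - x_mean) ** 2 for x in xs)
    if slope <= 0:
        return None  # no upward drift, so no predicted failure date
    intercept = y_mean - slope * x_mean
    crossing_day = (threshold - intercept) / slope  # day the trend hits the threshold
    return max(0.0, crossing_day - (n - 1))        # days from the last reading

# Vibration creeping up by ~0.5 units/day toward a failure threshold of 10.
print(days_until_threshold([5.0, 5.5, 6.0, 6.5, 7.0], threshold=10.0))  # → 6.0
```

The payoff is exactly the one the bullet describes: instead of reacting after the part breaks, maintenance can be scheduled a few days before the trend line reaches the danger zone.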
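A recommendation engine of the kind listed above can be approximated, in a deliberately simplified form, by counting which products are most often bought together; production engines use much richer predictive models. The order data below is invented for illustration.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_recommendations(orders, product, top_n=3):
    """Recommend the items most frequently bought together with `product`,
    based on pairwise co-occurrence counts across historical orders."""
    pairs = Counter()
    for order in orders:
        # count each unordered pair of distinct items in the order once
        for a, b in combinations(sorted(set(order)), 2):
            pairs[(a, b)] += 1
    scores = Counter()
    for (a, b), count in pairs.items():
        if a == product:
            scores[b] = count
        elif b == product:
            scores[a] = count
    return [item for item, _ in scores.most_common(top_n)]

orders = [["phone", "case", "charger"],
          ["phone", "case"],
          ["phone", "charger"],
          ["case", "screen protector"]]
print(cooccurrence_recommendations(orders, "phone"))  # → ['case', 'charger']
```

This is the "customers who bought X also bought Y" pattern: even this crude counting raises average order size, and it scales naturally to distributed processing because the pair counts can be computed independently per batch of orders and summed.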