Big Data Analytics

Here is Gartner’s definition, circa 2001 (which is still the go-to definition):

Volume

The amount of data matters. With big data, you’ll have to process high volumes of low-density, unstructured data. This can be data of unknown value, such as Twitter data feeds, click streams on a webpage or a mobile app, or sensor-enabled equipment. For some organizations, this might be tens of terabytes of data. For others, it may be hundreds of petabytes.
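At these volumes, data rarely fits in memory, so it is processed in bounded chunks. A minimal Python sketch of that idea, assuming a hypothetical sensor log with `device_id` and `reading` columns (the sample data and chunk size are illustrative, standing in for a multi-terabyte file):

```python
import csv
import io

# Hypothetical sensor log; in practice this would be a huge file on disk
# or in object storage -- a small in-memory sample stands in here.
SAMPLE = "device_id,reading\n" + "\n".join(
    f"dev{i % 3},{i * 0.5}" for i in range(10)
)

def merge(totals, chunk):
    """Fold one chunk of rows into the running per-device totals."""
    for row in chunk:
        key = row["device_id"]
        totals[key] = totals.get(key, 0.0) + float(row["reading"])

def chunked_totals(lines, chunk_size=4):
    """Aggregate readings per device, processing rows in fixed-size
    chunks so memory use stays bounded regardless of file size."""
    totals = {}
    chunk = []
    for row in csv.DictReader(lines):
        chunk.append(row)
        if len(chunk) == chunk_size:
            merge(totals, chunk)
            chunk = []
    if chunk:  # flush the final partial chunk
        merge(totals, chunk)
    return totals

print(chunked_totals(io.StringIO(SAMPLE)))
```

The same streaming-aggregation pattern is what frameworks like Spark apply across a cluster; the point is that only one chunk is ever resident at a time.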

Velocity

Velocity is the rate at which data is received and (perhaps) acted on. Typically, the highest-velocity data streams directly into memory rather than being written to disk first. Some internet-enabled smart products operate in real time or near real time and require real-time evaluation and action.
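In-memory evaluation of a stream often means computing over a sliding window as each event arrives. A minimal sketch, assuming hypothetical thermostat readings (a simplified stand-in for a stream processor such as Kafka Streams or Flink):

```python
from collections import deque

def rolling_average(events, window=3):
    """Yield the average of the last `window` readings as each event
    arrives -- the whole computation stays in memory, so results are
    available in near real time instead of after a batch job."""
    recent = deque(maxlen=window)  # old readings fall off automatically
    for reading in events:
        recent.append(reading)
        yield sum(recent) / len(recent)

# Hypothetical temperature readings from a smart thermostat.
stream = [20.0, 22.0, 24.0, 30.0]
print(list(rolling_average(stream)))
```

Because the generator yields after every event, an alerting rule (say, "average above a threshold") can fire the moment the condition is met rather than minutes later.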

Variety

Variety refers to the many types of data that are available. Traditional data types were structured and fit neatly in a relational database. With the rise of big data, data arrives in new unstructured forms. Unstructured and semi-structured data types, such as text, audio, and video, require additional preprocessing to derive meaning and support metadata.
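For text, that preprocessing step typically means normalizing the raw input into tokens and deriving simple metadata that can sit alongside structured records. A minimal sketch (the tokenization rule and metadata fields are illustrative, not a full NLP pipeline):

```python
import re
from collections import Counter

def preprocess(text):
    """Turn unstructured text into tokens plus simple metadata --
    the kind of preparation semi-structured data needs before it
    can be queried alongside structured records."""
    # Lowercase and keep only word-like runs; a real pipeline would
    # also handle stemming, stop words, language detection, etc.
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    metadata = {
        "token_count": len(tokens),
        "top_terms": [w for w, _ in Counter(tokens).most_common(2)],
    }
    return tokens, metadata

tokens, meta = preprocess("Big data, big value: data drives decisions.")
print(meta)
```

The derived metadata (counts, top terms) is what makes the unstructured blob searchable and joinable with the rest of the warehouse.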

Data has intrinsic value. But it’s of no use until that value is discovered. Equally important: how truthful is your data, and how much can you rely on it? Today, big data has become capital. Think of some of the world’s biggest tech companies. A large part of the value they offer comes from their data, which they’re constantly analyzing to improve efficiency and develop new products. So how are enterprises using big data today? Here are some of the most popular big data use cases.

Big Data Use Cases

  • Drive Innovation
  • Product Development
  • Predictive Maintenance
  • Strategic Business Decisions
  • Operational Efficiency
  • Machine Learning
  • Market Research
  • Customer Experience
  • Fraud Detection and Compliance