ContextStreams

Collect, contextualize and cast your raw data into actionable real-time business intelligence.

ContextStreams is the domain-centric Observability pipeline in vuSmartMaps™ that allows you to ingest data at high speed from a variety of sources. Through dynamic data pipelines, the data is then enriched with rich context – business, domain, semantic, syntactic, and state – while ensuring compliance with industry standards, and pushed out to external storage for building actionable business insights.

Why ContextStreams?

Collect

Ingest data at high speed from multiple data sources. You can leverage agents, pollers, APIs and O11ySources (our preconfigured observability blocks for infrastructure) without worrying about data format or structure.

Contextualize

In our data pipelines, you can transform, enrich, and filter your data with our out-of-the-box plugins. More importantly, our domain-centric adaptors transform your logs, metrics, and traces into intelligent insights.

Cast

Leverage our data store connectors to integrate the transformed data with destinations including vuSmartMaps™ HyperScale Data Store, TimescaleDB, Cloud or a downstream data store of your choice.

Fig: Interactive Diagram of ContextStream

Components of ContextStreams

At the core of ContextStreams is a distributed message hub for the ingested data, which is processed in real time via data pipelines.

The Data Pipeline serves as the backbone for data transformation within ContextStreams. The data from input streams is transformed by multiple plugins and domain-centric adaptors.

  • Plugins – Out-of-the-box, pre-built processing blocks that enrich, transform, correlate, filter, split your data, and more. Our library of plugins ranges from regular data transformation operations such as time conversion, field enrichment, arithmetic operations, and metrics aggregation to complex actions such as micro-transaction stage tracking and dynamic transaction-ID-based correlation of events.
  • Domain-Centric Adaptors – Built on business, domain, and environmental context, domain-centric adaptors transform your data for specific applications, compliance operations, and business functions (e.g., adaptors that understand payment transactions, transaction stages, payloads, error cases, and error codes). More importantly, ContextStreams comes with an arsenal of 25+ domain adaptors for the Banking and Financial Services industry.
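To make the plugin idea concrete, here is a minimal sketch of two plugin-style transforms – time conversion and field enrichment. The function names, record shapes, and field names (`ts`, `error_code`, `txn_stage`) are illustrative assumptions, not the actual ContextStreams plugin API.

```python
from datetime import datetime, timezone

def convert_time(record, field="ts", fmt="%d/%m/%Y %H:%M:%S"):
    """Normalize a timestamp string into epoch milliseconds (assumed UTC)."""
    dt = datetime.strptime(record[field], fmt).replace(tzinfo=timezone.utc)
    record[field] = int(dt.timestamp() * 1000)
    return record

def enrich(record, lookup):
    """Add business context (e.g., a payment-failure stage) from a lookup table."""
    record["txn_stage"] = lookup.get(record.get("error_code", "00"), "UNKNOWN")
    return record

# Chaining the two transforms mimics a two-block pipeline stage.
event = {"ts": "15/01/2024 10:30:00", "error_code": "U69"}
stages = {"00": "SUCCESS", "U69": "COLLECT_EXPIRED"}
event = enrich(convert_time(event), stages)
```

A domain-centric adaptor would bundle many such transforms with knowledge of a specific protocol's error codes and transaction stages.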

The data pipeline offers multi-branching that allows you to construct sophisticated pipelines.

  • Linear pipeline – A straightforward, sequential arrangement of plugins.
  • Branch pipeline – Offers the flexibility to generate multiple processed outputs from a single input data stream.
  • Multi-input pipeline – Enables the integration of various inputs into a single pipeline, directing them to a unified output stream.
  • Multi-output pipeline – Facilitates the entry of multiple inputs into the pipeline, capable of branching out into several outputs.
  • Complex pipeline – A network of multiple blocks that branch and merge, intricately designed to execute specific business tasks.
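The five shapes above can all be modeled as a directed graph of named blocks. The following sketch is purely illustrative – the `Pipeline` class and block names are assumptions, not the ContextStreams configuration format.

```python
class Pipeline:
    """Toy model of a pipeline as a directed graph of named blocks."""

    def __init__(self):
        self.edges = {}  # block name -> list of downstream block names

    def connect(self, src, dst):
        self.edges.setdefault(src, []).append(dst)
        return self  # allow chaining

# Linear: a sequential arrangement of blocks.
linear = (Pipeline()
          .connect("input", "parse")
          .connect("parse", "enrich")
          .connect("enrich", "output"))

# Branch: one input stream fans out to two processed outputs.
branch = (Pipeline()
          .connect("input", "parse")
          .connect("parse", "metrics_out")   # aggregated metrics
          .connect("parse", "logs_out"))     # enriched raw logs

# Multi-input: two sources merge into a single unified output stream.
multi_in = (Pipeline()
            .connect("app_logs", "correlate")
            .connect("gateway_logs", "correlate")
            .connect("correlate", "output"))
```

Multi-output and complex pipelines combine the fan-in and fan-out patterns shown here within one graph.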

Queues hold processed data from the data pipeline and hand it off to the data store connectors.

DataStore Connectors allow you to integrate data with vuSmartMaps™’s HyperScale Data Store, TimescaleDB, the cloud, or a downstream data store of your choice.

The Flows feature provides a comprehensive visual representation of the data’s journey within the ContextStreams.

Features

  • Seamlessly ingests data at scale in real-time and at incredible speeds.

  • Handles events in the order of millions of events per second owing to resource-efficient multi-threading and a horizontally scalable architecture.

  • Effortlessly verify and validate streamed data with the “Preview” feature. The data pipelines offer tailored and granular debug options for every stage — whether it is a published pipeline, a draft, or a specific block.

  • Ensures no data loss during ingestion and transformation by decoupling data connectors from the Data Pipeline, preventing data compromise due to endpoint failures.

  • Transform data through filtering, masking, mathematical operations, domain-related extraction, enrichment, and interpretation, using pre-defined and custom plug-ins.

  • Achieve cost and storage efficiency by transforming logs into metrics.
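The logs-to-metrics idea in the last point can be sketched as follows: instead of storing every raw log line, collapse each time window into a small set of summary metrics. The field names (`status`, `latency_ms`) and metric names are illustrative assumptions, not a ContextStreams schema.

```python
from collections import Counter

def logs_to_metrics(log_lines):
    """Collapse per-request log records into one windowed summary row."""
    status_counts = Counter(line["status"] for line in log_lines)
    latencies = [line["latency_ms"] for line in log_lines]
    return {
        "requests_total": len(log_lines),
        "errors_total": sum(c for s, c in status_counts.items() if s >= 500),
        "latency_ms_avg": sum(latencies) / len(latencies),
    }

# Three raw log records reduce to a single metric row per window.
window = [
    {"status": 200, "latency_ms": 12},
    {"status": 500, "latency_ms": 250},
    {"status": 200, "latency_ms": 18},
]
metrics = logs_to_metrics(window)
```

Storing the summary row instead of the raw lines is what drives the cost and storage savings, at the price of losing per-event detail for that window.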

Fig: Bento Diagram of ContextStream
