Click on an I/O stream name to view that stream's details.
There are three tabs under the 'View' option: I/O Streams Info, Readers, and Metrics.
The 'I/O Streams Info' tab provides comprehensive information about your I/O stream. From here you can access details such as stream configurations, data sources, and any transformations applied. Refer to the snapshot below for a visual guide.
The 'Readers' tab lists the configured readers associated with the respective I/O stream, offering insight into how data is consumed from it. For a detailed visual guide, see the snapshot below.
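Conceptually, a reader drains records from the stream it is attached to, much like a queue consumer. The sketch below models this with a minimal in-memory stand-in; the class and method names are purely illustrative and are not the platform's actual API.

```python
from collections import deque

class IOStream:
    """Minimal in-memory stand-in for an I/O stream (illustrative only)."""
    def __init__(self):
        self._buffer = deque()

    def write(self, record: dict) -> None:
        self._buffer.append(record)

    def read(self):
        # Return the oldest unread record, or None when caught up.
        return self._buffer.popleft() if self._buffer else None

# A 'reader' simply consumes records from the stream in arrival order.
stream = IOStream()
stream.write({"host": "app-01", "status": 200})
stream.write({"host": "app-02", "status": 500})

consumed = []
while (record := stream.read()) is not None:
    consumed.append(record)

print(len(consumed))  # 2
```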
Metrics are vital for performance analysis and optimization. Navigate to the 'Metrics' tab to access real-time performance metrics, ensuring efficient operation of your I/O streams. The tab displays crucial performance indicators for a 30-minute period, including 'Data In' and 'Data Out' metrics, offering insight into data flow within your streams. Additionally, the 'Total Compressed Data' metric provides a snapshot of compression efficiency.
‘Data In’ reflects the volume of incoming data ingested into the I/O stream in the last 30 minutes, indicating the data ingestion rate. ‘Data Out’ represents the amount of data processed by the stream in the same timeframe, signifying data processing efficiency. Monitoring fluctuations in both metrics aids in assessing overall data flow dynamics and system performance.
The ‘Total Compressed Data’ represents the cumulative size of compressed data processed by the streams, offering insights into compression effectiveness.
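The relationships among these metrics can be sketched with simple arithmetic. The figures and function names below are hypothetical, chosen only to illustrate how the 30-minute window relates ingestion volume to rate and compression efficiency.

```python
WINDOW_SECONDS = 30 * 60  # the 30-minute metrics window

def ingestion_rate(data_in_bytes: int, window_seconds: int = WINDOW_SECONDS) -> float:
    """Average ingestion rate (bytes/sec) implied by 'Data In' over the window."""
    return data_in_bytes / window_seconds

def compression_ratio(data_in_bytes: int, compressed_bytes: int) -> float:
    """How many times smaller the compressed data is than the raw input."""
    return data_in_bytes / compressed_bytes

# Hypothetical example: 1.8 GB ingested in 30 minutes, stored as 450 MB compressed.
rate = ingestion_rate(1_800_000_000)                    # 1,000,000 bytes/sec
ratio = compression_ratio(1_800_000_000, 450_000_000)   # 4.0x
print(f"{rate:.0f} B/s, {ratio:.1f}x compression")
```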
During debugging, the ‘Refresh’ button fetches real-time metric updates, providing an updated view of current performance and aiding in promptly identifying any issues.
Click on the Edit button to modify any details associated with the added I/O stream.
After clicking the Edit button, you will be directed to the editing page where you can make desired changes to the I/O stream configuration. Update the fields as needed, such as the description, partition settings, retention period, and other relevant parameters.
💡Note: For more detailed instructions on parameters to be provided in I/O Streams, you can refer here.
Once you’ve made the necessary adjustments, click the Save button to save the changes. Refer to the steps mentioned during the creation process for specific details on each field.
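The editable fields mentioned above can be pictured as a configuration payload. This is a sketch only: the key names below are illustrative, not the exact parameter names used by the platform (refer to the I/O Streams parameter reference for those).

```python
# Hypothetical I/O stream configuration (field names are illustrative).
stream_config = {
    "name": "app-logs",
    "description": "Application logs from the web tier",
    "partitions": 6,         # parallelism available to readers
    "retention_hours": 72,   # how long data is kept before expiry
}

# Editing a stream amounts to changing fields and saving the result.
stream_config["retention_hours"] = 168  # extend retention to 7 days
print(stream_config["retention_hours"])
```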
Select the I/O streams you want to delete and click the Delete button, then click Delete again to confirm. The selected I/O streams will be removed from the list.
Click on a pipeline name to view that pipeline's details.
There are four tabs under the 'View' option. Click each tab to view its details. The tabs are:
Under Flows, the pipeline flow chart is visible, and you can perform the following action:
You can view the Pipeline information from here.
For more detailed instructions on parameters in Data Pipeline Settings, you can refer here.
In the Metrics tab, you can find a list of essential metrics providing insights into the performance of the data pipeline, particularly focusing on the data flow.
Metrics are vital for performance analysis and optimization. Navigate to the ‘Metrics’ tab to access real-time performance metrics, ensuring efficient operation of the data pipeline. The tab displays crucial performance indicators for a specified 30-minute period, including ‘Data In’ and ‘Data Out’ metrics, offering insights into data flow within the pipeline.
‘Data In’ reflects the volume of incoming data ingested into the data pipeline in the last 30 minutes, indicating the data ingestion rate. ‘Data Out’ represents the amount of data processed by the pipeline in the same timeframe, signifying data processing efficiency. Monitoring fluctuations in both metrics aids in assessing overall data flow dynamics and system performance.
During debugging, the ‘Refresh’ button fetches real-time metric updates, providing an updated view of current performance and aiding in promptly identifying any issues.
This tab displays logs related to the pipeline’s activities. To debug for deeper analysis, logs can be analyzed here. These logs provide a comprehensive view of the pipeline’s execution, helping identify and address issues effectively.
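When debugging from the Logs tab, a common first step is to narrow the view to error entries. The sketch below shows this idea on a few hypothetical log lines; the log format is illustrative only.

```python
# Hypothetical pipeline log lines (format is illustrative).
logs = [
    "2024-05-01 10:00:01 INFO  pipeline started",
    "2024-05-01 10:00:05 ERROR failed to parse record: missing field 'ts'",
    "2024-05-01 10:00:09 INFO  checkpoint committed",
]

# Keep only error-level entries for closer inspection.
errors = [line for line in logs if " ERROR " in line]
print(len(errors))  # 1
```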
To edit the pipeline, click the Edit button located under the Actions column.
On clicking, you will be redirected to the following page.
Working with the Pipeline Editor is explained in detail here.
While editing the pipeline, another tab, 'Pipeline Info', is available under the edit section. You can configure the advanced pipeline settings from here. For more detailed instructions on parameters in Data Pipeline Settings, you can refer here.
Select the pipeline you want to delete and click the Delete button, either at the top of the list or under the Actions column.
Click on the Delete button to confirm the delete action.
The selected pipeline will be deleted and removed from the list.
There are 3 actions for all the connectors listed: View, Edit, and Delete.
Clicking the DataStore Connector name shows the connector details, including the total data (messages read and sent). There are two tabs under the 'View' option:
Here, the DataStore Connector details are visible, and you can also configure the advanced settings.
You can perform the following action:
Navigate to the ‘Metrics’ tab to access a comprehensive list of metrics for your DataStore Connectors.
Metrics are vital for performance analysis and optimization. Navigate to the 'Metrics' tab to access real-time performance metrics, ensuring efficient operation of the DataStore Connector. The tab displays crucial performance indicators for a 30-minute period, including 'Data In' and 'Data Out' metrics, offering insight into data flow within the connector.
‘Data In’ reflects the volume of incoming data ingested into the connector in the last 30 minutes, indicating the data ingestion rate. ‘Data Out’ represents the amount of data processed by the connector in the same timeframe, signifying data processing efficiency. Monitoring fluctuations in both metrics aids in assessing overall data flow dynamics and system performance.
During debugging, the ‘Refresh’ button fetches real-time metric updates, providing an updated view of current performance and aiding in promptly identifying any issues.
The ‘Edit Connector’ option enables you to configure specific settings and parameters for your connector. Make the desired changes to the configurations and click Save to apply them.
For detailed information on the fields and their configurations, please refer to the steps mentioned during the creation of the connector (ES Sink Connector and JDBC Sink Connector).
Select the DataStore Connector you want to delete and click on the Delete icon.
Click on the Delete button to confirm the delete action.
The selected DataStore Connector will be deleted and removed from the list.
Flows provide a dynamic visual representation of the data flow within the system, illustrating the path data follows from the source data, through collection agents, to permanent storage.
In this section, the left panel serves as a tool for selecting and configuring the elements you want to visualize on the right panel. Utilize the options for Input Data Stream, Pipeline Data, Output Data Stream, and DataStore Connector to tailor the displayed flows based on your preferences and requirements. This configuration allows you to optimize the visualization and focus on the specific aspects of your data flow journey.
Flows empower you to visualize, adapt, and optimize the data’s journey from its origin to permanent storage, enhancing your understanding and control over the data processing pipeline.