Log Analytics

Introduction

Log Analytics helps you explore, analyze, and visualize logs easily on the vuSmartMaps platform. With its intuitive interface and advanced querying capabilities, Log Analytics enables users to efficiently retrieve and analyze log data, identify trends, and gain actionable insights from large volumes of log data.

Getting Started

Accessing Log Analytics

The Log Analytics page can be accessed from the platform's left navigation menu by navigating to Observability -> Log Analytics.

When you click Log Analytics, you will be directed to the Log Analytics landing page.

User Interface Overview

At the top right corner, you'll find the global time selector, which lets you view and analyze logs for the selected time range.

Beside the time filter, there is a live data toggle button that, when enabled, displays the most recent 500 records in real time, automatically refreshing every 10 seconds.

Below are the action buttons, each serving a specific purpose:

  • New: This button allows you to initiate a new search.
  • Open: Use this option to access previously saved searches.
  • More Actions: Here, users can save the search, grant permissions, and create a clone of the search.

The search query field allows you to input phrases or keywords to retrieve specific logs from the Table.
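For example, a simple token query such as the one below returns the log messages containing that keyword; the complete query syntax is covered in the Quick Reference Table for Log Queries later in this section.

error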

On the left-hand side:

1. The Data Store option allows you to choose the particular Data Store.

2. The Table option allows you to choose the table from which you want to fetch the logs.

The Data Stores are database connection instances configured in the data modeling workspace. Each Data Store defines a separate connection to a database instance, either internal or external, that can be used for data analysis. VuNet's HyperScale database instances are the only type of Data Store supported for log analytics. The default internal HyperScale Data Store instance is available in all systems for use in log analytics.

Once the Data Store and Table are selected, the Log Analytics module lists the matching logs for the selected time range, without any other filters.

💡Note: To ensure the table appears in the Log Analytics table listing, certain fields must be present within it.

3. The Columns option on the left enables you to choose which columns are displayed in the Table. By default, the ‘timestamp’ and ‘message’ fields are selected on the Table.

In the above example, the Columns option also includes the following fields.

  • Module
  • Log level

💡Note: Column options may vary depending on the Data Store and selected Table. A maximum of seven columns can be selected at a time.

On the right-hand side, the trend chart and log table are displayed based on the selected Data Store, Table, and time range. This enables you to visualize trends and analyze log data precisely tailored to the chosen parameters, enhancing the effectiveness of the data exploration and analysis within the platform.

Exploring Log Data

When accessing Log Analytics, you’ll have the choice to either select the Data Store or configure O11ySources. Notably, only O11ySources configured with timestamps and tables will allow users to visualize the log data.

On clicking the O11ySource, you will be redirected to the following page.

Here, you can configure the O11ySource. To learn how to configure the O11ySource, refer to this page. Once configured, you can select it from the Data Store dropdown menu.

To visualize the log data, start by selecting the Data Store and Table you intend to analyze from the dropdown menu, then choose the time range.

For example, let's select a time range of Last 90 days, the Data Store CH (ClickHouse), and the Table vuSmartMaps_logs_data.

Upon selection, the trend chart and table will appear on the right-hand side of the screen.

Alternatively, if you toggle live data on, the table will show the most recent records.

Visualizing Logs and Patterns

If you look closely at the trend chart, the figure at the top represents the total number of logs present in the table for the selected time range, i.e., the Document Count Trend (approximately 68 lakhs in this example). The area chart component makes it easier to dissect the data directly from the graph.

Under the Columns section, select the desired columns to be displayed in the table; for example, selecting "module" and "log_level" reflects those columns in the table on the right-hand side.

In addition, columns can be easily rearranged by dragging them once they have been selected.

To check the most frequent values of a particular field, click on the horizontal ellipsis next to that field in the Columns section.

On clicking it, the top five values in that column are displayed. For instance, clicking on the module field shows its top five values, each with a count indicating its frequency and the percentage of the column that the value accounts for.

💡Note: Column expansion can be done for all fields except message, because it is unique, descriptive, and unquantified.

You can use both the Filter for value and Filter out value options on any specific value to refine the values within the column.

Filter for value: This shows only the filtered value in the column. For instance, if you want to keep only vuAlert in the module field, you can click on the Filter for value option.

Filter out value: This feature enables you to display all values except the filtered one in the columns. For example, if you wish to exclude ‘vuAlert’ from the module field, you can select the Filter out value option.

Field Filters

The ‘Add filter’ option located at the top of the table streamlines the search for specific logs. For instance, you can search by log_level to filter for Error logs.

Hence, it displays all the logs corresponding to the selected filter.

Using Text Queries in the Search Box

Text-based query syntax can be used in the search box to interact with the system for analytics.

Users can type in any text pattern in the search box and the system will display matching log patterns.

The search box is designed to provide suggestions based on the user’s search history. It will display the ten most recent searches to assist users in finding relevant information quickly.

In addition, the platform supports additional query syntax including compound expressions to interact with the logs.

Quick Reference Table for Log Queries

Token Search

  • Example: error
  • Description: Search for log messages with the token 'error' present. This is a case-insensitive search.

Multiple tokens – AND Condition

  • Examples: Error Kubernetes or Error + Kubernetes
  • Description: Search for log messages with both the tokens 'error' and 'Kubernetes' present. This is a case-insensitive search.

Multiple tokens – OR Condition

  • Example: Error | Kubernetes
  • Description: Search for log messages with either the token 'error' or 'Kubernetes' present. This is a case-insensitive search.

Substring search

  • Example: has(err)
  • Description: Search for log messages containing the string 'err'. This is a case-insensitive search.

Starts-with search

  • Example: starts(err)
  • Description: Search for log messages that start with the string 'err'. This is a case-insensitive search.

Ends-with search

  • Example: ends(err)
  • Description: Search for log messages that end with the string 'err'. This is a case-insensitive search.

Case-sensitive search

  • Examples: case('Error') or case('Error') | case('Warning')
  • Description: Case-sensitive search. The case operator can be combined with any of the other operators.

Compare field value

  • Examples (using the field 'module'): module:=1, module:[1:4], module:>1, module:<1, module:>=1, module:<=1
  • Description: Search for log messages that satisfy the condition specified for the field.

Composite search

  • Example: starts('Err') + "Web" + host:"1.1.1.1"
  • Description: Composite queries can be constructed using any number of individual operators.

Searching multiple keywords

Using logical ‘and’ or ‘or’ operators to search for multiple keywords

error + Server

The above will find all log lines that have ‘Error’ and ‘Server’ in them.

Error | Service

The above will find all log lines which have either ‘Error’ or ‘Service’ in them.

Negate the Operation

The meaning of any search operation can be negated by using ~ operator (tilde)

~Error

Will find all logs not having ‘Error’ as a token

Case Sensitive Search

By default, all log searches are done without considering the case. Use case keyword to do a case-sensitive search

case(First)

Will find all logs having 'First' as a token. This will not match first, FIRST, etc.

Searching for logs containing a string

Use the has keyword for a substring match.

has(Err)

Will find all logs having the string ‘Err’ in the log message.

💡Note: The vuSmartMaps platform tokenizes log messages to build an index for efficient text search. The tokenization is done by splitting the message based on space and special characters. By default, search operations look for matching tokens. If a substring search or complex phrase search is required by users, appropriate keywords are to be used.
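As an illustrative sketch of this behaviour (the log line below is hypothetical, not platform output), a message such as

Failed to connect to host 10.20.1.5: connection refused

would be split on spaces and special characters into tokens like 'Failed', 'to', 'connect', 'host', '10', '20', '1', '5', 'connection', and 'refused'. A token search such as connection refused would match this message, whereas finding a fragment like 'conn' would need a substring operator such as has(conn).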

Searching for logs with a value in a field

Searching for matching entries with a keyword in a particular column in the log table is done using the <field>:<value> syntax.

module:vuAlert

Will find all records in the log table where the module field has the value vuAlert.

💡Note: The vuSmartMaps platform stores the raw log content in the field named message. All search operations, by default, do their matching in the message field. If matching is required in a field other than message, the <field-name>:<search-pattern> syntax is to be used.
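For instance, assuming the module field from the earlier examples, the two queries below behave differently:

error

module:error

The first matches the token 'error' in the message field, while the second looks for the value 'error' in the module field (field value matching is case-sensitive by default).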

Searching for logs containing a string in a field

Use the has keyword for a substring match.

module:has(vu)

Will find all logs having the string ‘vu’ in the module field.

Searching for logs starting with a string in a field

log_level:starts(ERR)

Will find all logs in which the log_level field starts with the string 'ERR'.

Searching for logs ending with a string in a field

log_level:ends(ING)

Will find all logs in which the log_level field ends with the string 'ING'.

Search for logs having one of the specified values in a field

log_level:in(ERROR,WARNING)

The above will find all log lines in which the log_level field has a value of either ERROR or WARNING.

Advanced Log Search Queries with Number Field

Searching for logs with a value of the number field matching a condition:

row:=110895780

Searching for logs with the value of the number field falling in a range:

row:[110188861:110188865]

Similarly, you can also try usage:<10, usage:>=10, and usage:<=10.
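These numeric conditions can also be combined with other operators in a composite query. As an illustrative sketch, assuming a numeric usage field and the module field from the earlier examples:

usage:>=10 + module:vuAlert

This would return the log lines where the usage field is at least 10 and the module field has the value vuAlert.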

Negate the operation in case of field value search

The tilde (~) operator can be used to negate any operation done at the field level.

~module:vuAlert

This will find all log lines in which the module field does not have the value vuAlert.

Case-insensitive search on a field

By default, all field value matching is done in a case-sensitive manner. If case insensitive search is required, use the has keyword.

log_level:has(war)

Will find all logs in which the log_level field contains the string 'war', ignoring case (for example, matching WARNING).

Searching for log records in which a field has a non-null value

module:exists

Will find all log lines in which the module field is not empty.

Surrounding logs

When you click the Action button for a specific log, the logs surrounding the selected log are displayed: ten chronologically previous log lines and ten subsequent log lines. These surrounding logs aid in understanding the context of the logs generated by the system around the time the selected log was produced.

The surrounding logs are located by:

  • Removing any filters applied in the search box
  • Locating 10 log lines chronologically preceding and succeeding the log line selected
  • While locating the surrounding log lines, the system preserves any table-level filters applied

Saving and reusing a Search

To save a specific search, click on the More Actions button in the top right corner, and then select Save.

When you click the Save button, a pop-up will appear. Add a name and an optional description. You can also toggle Store time with Saved Search, which saves the search along with the selected time range; whenever that saved search is opened, it automatically applies the saved time range by default and loads the logs for that range.

💡Note: When Live data is active and the ‘Save Time Range’ option is enabled, it will automatically save the data for the previous 15 minutes.

Click on Save. To access the saved search, click on the Open button in the top right corner.

On clicking the Open button, a pop-up will appear where you can access all the saved searches. You can also directly delete the saved search from here.

Now, proceed to select the desired saved search listed in the table.

💡Note: The same Saved Searches can be used to create the Data Model. Please refer to the data model section for more information.

Access Permissions of Saved Search

While saving a search, access permissions can be assigned to different user roles to control the set of users having a view or modify permission to the saved search. To grant permission for the saved search, navigate to More Actions > Permissions.

For every role, you can assign one of three permission types:

  • View: Users with the selected role can only view the saved search.
  • Modify: Users with the selected role can also modify and make changes to the saved search.
  • None: No permissions are given.

Exporting Saved Search

To export the saved search, click on the export button.

On clicking the export button, a pop-up will appear where you can download the logs in CSV format, with a maximum limit of 5000 records. Alternatively, you can set the log limit to 100, 500, or 1000. Once done, click on the download button to download the file to your local system.

Conclusion

Log Analytics is a powerful tool on vuSmartMaps that helps you manage and understand your logs better. It is designed to suit all kinds of users, offering powerful search options, easy-to-use visualizations, and simple ways to handle your data. With Log Analytics, you'll be able to get the most out of your log data and make smarter decisions.
