
Log Analytics

Introduction

Log Analytics helps you explore, analyze, and visualize logs on the vuSmartMaps platform. With its intuitive interface and advanced querying capabilities, Log Analytics enables users to efficiently retrieve and analyze log data, identify trends, and gain actionable insights from large volumes of logs.

Getting Started

Accessing Log Analytics

  1. The Log Analytics page can be accessed from the platform's left navigation menu by navigating to Observability -> Log Analytics.
  2. When you click on Log Analytics, you will be directed to the following landing page.

User Interface Overview

On the Log Analytics landing page, you can select a Data Store and Table to start a new analysis. HyperScale is selected as the default Data Store.

1. The Data Store option allows you to choose the particular Data Store to query.

2. The Table option allows you to choose the table from which you want to fetch the logs.

The Data Stores are database connection instances configured in the data modeling workspace. Each Data Store defines a separate connection to a database instance, either internal or external, that can be used for data analysis. VuNet’s HyperScale database instances are the only type of Data Store supported for log analytics. The default internal HyperScale Data Store instance is available in all systems for use in Log Analytics.

Once the Data Store and Table are selected, the log analytics module provides a listing of matching logs for the time selected, without any other filters.

💡Note: To ensure the table appears in the Log Analytics table listing, it must contain the following fields: timestamp, message, log_uuid, and message_lower.
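For reference, a minimal record from a compatible table might look like the sketch below. This is illustrative only: the four field names come from the note above, while the example values, the assumption that message_lower is a lowercased copy of message, and the Python representation itself are not taken from the product documentation.

```python
# Illustrative sketch of a minimal log record containing the fields a table
# must expose to appear in the Log Analytics table listing.
# Field names are from the note above; the values are made up, and the purpose
# of message_lower (a lowercased copy used for searching) is an assumption.
minimal_log_record = {
    "timestamp": "2024-05-01T10:15:30Z",
    "message": "Connection timed out while reaching upstream",
    "log_uuid": "4f6c1d2e-9b3a-4c8e-8f21-0a1b2c3d4e5f",
    "message_lower": "connection timed out while reaching upstream",
}
```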

  1. Alternatively, you have the option to load a previously saved search.

2. Once the Data Store and Table are selected, you will land on the following page.

3. If you want to change the selected data store and table, click on the currently selected data store at the top left.

4. You will be redirected to the page where you selected the Data Store and Table. Now, make the desired selections and click the Start a New Analysis button.

5. At the top right corner, you’ll find the global time selector, allowing you to view logs for a selected time range.

6. Next to the time filter, there’s a Live data toggle that, when enabled, displays the most recent 500 records from the last five minutes in real time, automatically refreshing every 10 seconds (see the sketch after this list).

7. The Refresh button beside the Live data toggle refreshes the data, and the User Guide tab redirects you to the Log Analytics User Guide.
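To make the Live data behavior concrete, here is a minimal sketch of what the toggle does from the user's point of view: it re-queries the most recent 500 records from the last five minutes and refreshes every 10 seconds. The fetch_logs function below is a hypothetical stand-in, not the platform's actual API.

```python
import time
from datetime import datetime, timedelta, timezone

def fetch_logs(table, start, end, limit):
    """Hypothetical stand-in for the platform's internal log query."""
    return []  # placeholder: the real query is executed by the platform

def live_data_loop(table):
    """Mirror the Live data toggle: the latest 500 records from the last
    5 minutes, refreshed every 10 seconds."""
    while True:
        now = datetime.now(timezone.utc)
        logs = fetch_logs(table, start=now - timedelta(minutes=5), end=now, limit=500)
        print(f"{len(logs)} records as of {now.isoformat()}")
        time.sleep(10)
```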

Below are the action buttons, each serving a specific purpose:

  • Save: Clicking the button opens a context menu with two options. For a new search, the ‘Save’ option is disabled; you need to use ‘Save As’. For a saved search, use ‘Save’. In an already saved search, ‘Save As’ functions like a clone.
  • New/Load: This button allows you to initiate a new search or access previously saved searches.
  • More Actions: Here, users can grant permissions on a saved search.

  1. The search query field allows you to input VQL-based text queries to retrieve specific logs from the Table.
  2. Clicking the ‘VQL’ button on the search bar opens the VQL help section documentation in a new tab.

3. The Build Filter option located next to the search box allows you to add filters. By clicking ‘Add Filter,’ you can select columns and operators to filter out specific logs in the table.

4. The Columns option on the left enables you to choose which columns are displayed in the Table. By default, the ‘timestamp’ and ‘message’ fields are selected on the Table.

💡Note: Column options may vary depending on the Data Store and selected Table. A maximum of twenty-five columns can be selected at a time.

On the right-hand side, the area chart and log table are displayed based on the selected Data Store, Table, and time range. This enables you to visualize trends and analyze log data precisely tailored to the chosen parameters, enhancing the effectiveness of the data exploration and analysis within the platform.

Exploring Log Data

  1. To visualize the log data, start by selecting the Data Store and Table you intend to analyze from the dropdown menu, and click on Start a New Analysis button.

    • 💡Note: Instead of displaying raw table names on the Log Analytics pages, the system displays user-friendly labels for better readability.

2. Now, you can choose the time range. By default, it will show data for the last 1 day.

    • For example, let’s select the time range of the Last 2 days.

3. Once these selections are made, the trend chart and table on the right-hand side of the screen will display values based on the chosen time range.

4. Alternatively, if we toggle the live data, it will showcase the latest 500 records from the last 5 minutes.

Visualizing Logs and Patterns

  1. If we look closely at the trend chart, the top shows the total number of logs present in the table for the selected time range, i.e., Document Count Trend (approx. 1 lakh). The Area Chart component enhances the ability to dissect data directly from the graph.

  2. Under the Columns section, select the desired columns to be displayed on the table; for example, select “log level” and “module”. The selection is reflected in the table on the right-hand side.

3. In addition, columns can be easily rearranged by dragging them once they have been selected.

4. To check the most frequent values of a particular field in the column, click on the horizontal ellipsis associated with the specific field listed in the Columns section.

5. On clicking, it will show the top five values in that column. For instance, if we click on the module field, it will display the top five values in that column, each with a count indicating its frequency and the percentage of the column that the value accounts for (see the sketch after this list).

💡Note: Column section expansion can be done for all the fields except message, because it is unique, descriptive, and unquantified.

6. You can use both the Filter for value and Filter out value options on any specific value to refine the values within the column.

    • Filter for value: This allows you to show the filtered value in the columns. For instance, if you want to filter ‘vuAlert’ in the module field, you can click on the Filter for value option.

    • Filter out value: This feature enables you to display all values except the filtered one in the columns. For example, if you wish to exclude ‘vuAlert’ from the module field, you can select the Filter out value option.
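The column expansion described above is essentially a top-N value count. The sketch below reproduces that calculation over a list of already-fetched records; the record structure and sample data are assumptions, but the output mirrors what the panel shows, i.e., the top five values with their counts and percentages.

```python
from collections import Counter

def top_values(records, field, n=5):
    """Return the n most frequent values of a field with their count and
    percentage, mirroring the column expansion panel."""
    values = [r[field] for r in records if r.get(field) is not None]
    total = len(values) or 1
    return [(value, count, round(100.0 * count / total, 1))
            for value, count in Counter(values).most_common(n)]

# Example with made-up records:
records = [{"module": "vuAlert"}, {"module": "vuAlert"}, {"module": "Vusearch"},
           {"module": "vuAlert"}, {"module": "vublock"}]
print(top_values(records, "module"))
# [('vuAlert', 3, 60.0), ('Vusearch', 1, 20.0), ('vublock', 1, 20.0)]
```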

Querying For Logs

There are two methods available for searching and querying logs:

  1. Using VQL-based Text Queries in Search Bar: You can enter VQL (VuNet Query Language) based text queries directly into the search bar to retrieve specific logs matching your criteria.
  2. Using Filter Operations in Filter Menu: Alternatively, utilize the Build filter menu to construct queries using filter operations. This method allows you to incrementally build simple or compound queries, providing a robust interface for log analysis.

In both cases, these methods enable the creation of powerful queries that facilitate detailed log analysis.

Build Filters

  1. The Build Filter option located next to the search box allows you to add filters. By clicking ‘Add Filter,’ you can select columns and operators to filter specific logs in the table.

💡Note: You can add multiple filters at a time by clicking on ‘Add Filter’ more than once. Additionally, even if a column is not selected for display in the table, you can still run a search on that column.

2. Please note that the log_uuid filter is no longer supported.

3. For instance, you can search a string-type field such as ‘module’, choose the operator ‘Contains’, specify the term as ‘Vusearch’, and then click on Apply.

4. The filter will be applied successfully and will be visible at the top left below the search bar.

5. You can add multiple filters simultaneously, and each filter will display alongside one another.

💡Note: If multiple filters are added simultaneously, they operate as an AND operation. Similarly, if you add multiple filters and use VQL separately, the combined filters and VQL also operate as an AND operation (a sketch of this combination follows at the end of this section).

6. By clicking on a specific filter, you can modify its configuration, delete the query, or temporarily disable the query.

7. The Filter Actions button on the left allows you to enable, disable, or delete all filters directly.
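As the note above mentions, filter pills and any VQL text query are combined with AND. The sketch below expresses that combination as plain predicate logic over already-fetched records. The filter structure, the matches_vql helper, and the Contains behaviour shown here are hypothetical illustrations, not the platform's actual implementation.

```python
def matches_filter(record, flt):
    """Evaluate one filter pill against a record (only 'Contains' is sketched)."""
    value = str(record.get(flt["column"], ""))
    if flt["operator"] == "Contains":
        return flt["term"].lower() in value.lower()
    raise ValueError(f"operator not sketched: {flt['operator']}")

def matches_vql(record, query):
    """Hypothetical stand-in for VQL evaluation; here, a simple substring check
    against the message field."""
    return query.lower() in str(record.get("message", "")).lower()

def matches(record, filters, vql_query=None):
    # All filter pills AND the VQL query must hold (AND semantics).
    ok = all(matches_filter(record, flt) for flt in filters)
    if vql_query:
        ok = ok and matches_vql(record, vql_query)
    return ok

# Example: the 'module Contains Vusearch' filter from the steps above, plus a VQL query.
filters = [{"column": "module", "operator": "Contains", "term": "Vusearch"}]
record = {"module": "Vusearch", "message": "Error while indexing a batch"}
print(matches(record, filters, vql_query="Error"))  # True
```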

Using Text Queries in the Search Box

  1. Text-based query syntax can be used in the search box to interact with the system for analytics.
  2. Users can type in VQL-based text queries in the search box and the system will display matching log patterns.
  3. The search box is designed to provide suggestions based on the user’s search history. It will display the ten most recent searches to assist users in finding relevant information quickly.

4. The platform also supports additional query syntax, including compound expressions, to interact with the logs.

5. For more detailed information, please refer to the VQL page.
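For quick reference, the block below collects the VQL constructs cited in the FAQs later in this guide, shown as plain query strings. The exact operator list and grammar are documented on the VQL page, so treat these as examples rather than a complete reference.

```python
# Example VQL text queries as they would be typed into the search box.
# The ~, + and field:has() constructs are the ones cited in the FAQs below;
# see the VQL page for the authoritative syntax.
example_queries = [
    "error + Server",       # logs containing both 'Error' and 'Server'
    "~Error",               # logs that do NOT contain 'Error'
    "log_level:has(war)",   # case-insensitive match on the log_level field
]
for query in example_queries:
    print(query)
```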

Columns Sidebar Toggle

The Columns toggle controls the visibility of the columns sidebar. The sidebar is visible by default when the page loads, and the toggle, which is enabled by default, lets users hide the sidebar by switching it off. This feature offers flexibility for those who prefer a more streamlined view of the logs without the column options sidebar.

Expand Logs

This allows multiline data, like log messages, to be displayed within cells. Users can expand rows to see specific lines of the log message, although messages longer than approximately four lines will be truncated.

Surrounding logs

  1. When you click the ‘Surrounding Logs‘ button in the Actions column for a specific log in the table, it displays logs that surround the selected log.
  2. This includes the hundred log lines chronologically preceding and following the selected log. Reviewing these surrounding logs helps in understanding the context of the system logs generated around the time the selected log was produced.

💡Note: Any applied filter will be automatically disabled when checking the surrounding logs.

3. The surrounding logs are located by (see the sketch at the end of this section):

    • Temporarily disabling any filters applied.
    • Locating the 100 log lines chronologically preceding and succeeding the selected log line.
    • Preserving any table-level filters while locating the surrounding log lines.

💡Note: If all applied filters are disabled when viewing surrounding logs, users can still mute/unmute existing filter pills by clicking on them.

4. Additionally, in surrounding logs, you can’t add or edit any filters. Please be aware that any changes to columns selected to be shown on the table from the left side will not be preserved when returning to the main page from the surrounding logs view.
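The lookup described in the list above can be pictured with the minimal sketch below. It operates on an in-memory, chronologically sorted list of records that is assumed to already reflect any table-level filters; the real platform of course queries the Data Store, so this is illustrative only.

```python
def surrounding_logs(sorted_logs, selected_uuid, window=100):
    """Return up to `window` log lines before and after the selected log.

    `sorted_logs` is assumed to be chronologically ordered and to already
    reflect table-level filters (which are preserved); user-applied filter
    pills are ignored, mirroring their temporary disablement."""
    index = next(i for i, log in enumerate(sorted_logs)
                 if log["log_uuid"] == selected_uuid)
    start = max(0, index - window)
    return sorted_logs[start:index + window + 1]
```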

Saving and Reusing a Search

  1. To save a specific search, click on the Save button in the top right corner, and then select Save As.

2. When you click the Save As button, a pop-up will appear. Add the name and description (optional), and you can toggle Store time with Saved Search, which saves the search with the selected time. Whenever that search query is opened, it automatically sets the saved time range by default and loads the logs for that time range.

💡Note: When Live data is active and the ‘Save Time Range’ option is enabled, it will automatically save the data for the previous 15 minutes.

3. Click on Save. To access the saved search, click on the New/Load button in the top right corner and then click on Load Search.

4. On clicking the Load Search button, a section will appear where you can access all the saved searches. You can also directly delete the saved search from here.

5. Now, proceed to select the desired saved search listed in the table.

💡Note: The same Saved Searches can be used to create the Data Model. Please refer to the data model section for more information.

Access Permissions of Saved Search

  1. While saving a search, access permissions can be assigned to different user roles to control which users can view or modify the saved search. To grant permission for the saved search, navigate to More Actions > Permissions.
  2. For every role, you can grant three types of permission:

    • View: The selected user can only view the saved search (the Save button will be disabled for these users).
    • Modify: The selected user can also modify and make changes to the saved search, based on the object-level permissions granted.
    • None: No permissions are given.

Sharing a Saved Search

To share a search view with others:

  • Save the search view by using the save function within the interface.
  • Other users can access this saved search view by logging in and navigating to the saved searches section.
  • Once accessed, users can view the saved search, ensuring collaborative access to important log insights.

Exporting a Search

To export the search, click on the export button.

On clicking the export button, a pop-up will appear where you can download the logs in CSV format, with a maximum limit of 5000 logs. Alternatively, you can set the log limit to 100, 500, or 1000. Once done, click on the download button to download the file to your local system.

💡Note: Only the data from the columns selected in the table will be exported in the CSV file. If data is sorted, the exported CSV will preserve that sorting as well.
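The export rule described above is straightforward: write only the selected columns, keep the incoming row order (so any sorting is preserved), and cap the row count at the chosen limit. The sketch below shows that rule using Python's csv module; the function and its arguments are illustrative and not the platform's export API.

```python
import csv

def export_logs_csv(path, logs, selected_columns, limit=5000):
    """Write logs to a CSV file, keeping only the selected columns and the
    incoming row order (so applied sorting is preserved). `limit` mirrors the
    pop-up options: 100, 500, 1000, or the 5000 maximum."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=selected_columns, extrasaction="ignore")
        writer.writeheader()
        for row in logs[:limit]:
            writer.writerow(row)

# Example with made-up records and the default column selection:
logs = [{"timestamp": "2024-05-01T10:15:30Z", "message": "Connection timed out", "module": "vuAlert"}]
export_logs_csv("logs_export.csv", logs, selected_columns=["timestamp", "message"], limit=100)
```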

Viewing Individual Log

For a detailed view of a particular log, click on the specific log you wish to view from the list of available logs.

On clicking, you will see the detailed view of that particular log. If any of the fields are empty, they will be shown under the empty field section within the log details drawer. Additionally, you can make a copy of the log and export it.

Conclusion

Log Analytics is a powerful tool on vuSmartMaps that helps you manage and understand your logs better. It is designed to suit all kinds of users, with powerful search options, easy-to-use visualizations, and simple ways to handle your data. With Log Analytics, you can get the most out of your log data and make smarter decisions.

FAQs

How do I search for logs with specific keywords?

To search for logs with specific keywords, use the search query field on the Log Analytics page. Enter your keywords, and the system will display matching logs. For more advanced search options, such as case-sensitive searches or compound queries, refer to the Using Text Queries section in the user guide.

How do I save and reuse a search query?

To save a search query, click the Save button in the top right corner, select ‘Save As’, and provide a name and description. You can access saved searches by clicking the ‘New/Load’ button and then ‘Load Search’. For step-by-step instructions, refer to the Saving and Reusing a Search section.

How can I find logs that do not contain a specific keyword?

You can use the negate operator ~ to find logs that do not contain a specific keyword. For example, ~Error finds all logs not containing ‘Error’. Refer to the Negate the Operation section for more information.

Can I perform case-insensitive searches on a specific field?

Yes, use the has keyword for case-insensitive searches on specific fields. For example, log_level:has(war) finds all logs with ‘war’ in the log_level field. See the Case-insensitive Search on a Field section for more details.

Can I control who can view or modify my saved searches?

Yes, you can assign access permissions to different user roles by navigating to More Actions > Permissions while saving a search. This allows you to control who can view or modify the saved searches. Detailed instructions are available in the Access Permissions of Saved Search section.

How do I search for logs containing multiple keywords?

You can construct complex queries using logical operators (AND, OR) and specific field conditions. For example, error + Server finds logs containing both ‘Error’ and ‘Server’. Refer to the Searching Multiple Keywords section for examples.

Why is the Live data feature useful?

The Live data feature provides up-to-the-minute information, essential for monitoring critical operations in real time. It ensures that teams can respond swiftly to any emerging issues, maintaining system stability and performance.

How does Log Analytics help with capacity planning?

By analyzing historical log data, Log Analytics can identify usage patterns and peak times, helping organizations plan for capacity needs and optimize resource allocation. This ensures that the system can handle load variations efficiently without over-provisioning.

Why would I customize the columns shown in the table?

Customizing columns allows users to focus on the most relevant data points, improving clarity and making it easier to spot trends and anomalies. This tailored view helps in conducting more precise and meaningful analyses.

How does Log Analytics improve the overall user experience?

Log Analytics improves the overall user experience by providing detailed insights into system performance and user behavior. This enables quicker issue resolution, better system optimization, and ultimately a smoother, more reliable experience for users on the vuSmartMaps platform.
