In some scenarios, you may need to collect logs from applications that do not write their information and errors through traditional channels such as the Windows Event Log or, on Linux systems, Syslog. Log Analytics allows us to collect these events from text files on both Windows and the supported Linux distributions.
New entries written to the custom log are collected by Log Analytics every 5 minutes. The agent also keeps track of the last entry collected, so that even if the agent stops for some time no data is lost: when it comes back up, it resumes processing from the point where it left off.
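Conceptually, this checkpointing behavior can be sketched in Python. This is only an illustration of the idea, not the agent's actual code; the function and state-file names are hypothetical:

```python
import os

def read_new_entries(path, state_file):
    # Illustrative sketch of checkpointing (not the agent's real code):
    # remember the byte offset of the last entry collected, so a restart
    # resumes from where it left off instead of losing or re-reading data.
    offset = 0
    if os.path.exists(state_file):
        with open(state_file) as f:
            offset = int(f.read() or 0)
    with open(path) as log:
        log.seek(offset)
        entries = log.readlines()
        new_offset = log.tell()
    with open(state_file, "w") as f:
        f.write(str(new_offset))
    return entries
```

Calling the function again after new lines are appended returns only the lines written since the previous call.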
In order for Log Analytics to collect the log files, the following requirements must be met:
- The log must have a single entry on each line of the file, or each entry must begin with a timestamp that matches one of the following formats:
- YYYY-MM-DD HH:MM:SS
- M/D/YYYY HH:MM:SS AM/PM
- Mon DD, YYYY HH:MM:SS
- yyMMdd HH:mm:ss
- ddMMyy HH:mm:ss
- MMM d hh:mm:ss
- dd/MMM/yyyy:HH:mm:ss zzz
- The log file must not be configured to be overwritten with circular updates.
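As a quick example, here is a minimal Python sketch of an application writing entries prefixed with the first supported format, YYYY-MM-DD HH:MM:SS (the file name and message below are hypothetical):

```python
from datetime import datetime

def write_entry(path, message):
    # Prefix each record with a timestamp in the
    # YYYY-MM-DD HH:MM:SS format supported by Log Analytics.
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with open(path, "a") as log:
        log.write(f"{stamp} {message}\n")

write_entry("app.log", "connection refused by upstream server")
```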
Defining a custom log
To collect information from a custom log, follow these simple steps.
- Open the Custom Log wizard:
- Log into the OMS portal
- Settings – Data
- Custom Logs
- Add +
By default, all changes made in the Custom Logs section are automatically sent to all agents. For Linux, a configuration file is sent to the Fluentd data collector. If you want to edit this file on Linux manually, you need to clear the flag "Apply below configuration to my Linux machines".
- Upload and parse a sample log:
Select the method that should be used to delimit each record of the file. By default, the wizard proposes delimiting the file by rows (New Line); this method can be used when the log file contains a single entry on each line. Alternatively, you can select Timestamp to delimit each record if it starts with a timestamp in one of the supported formats. If Timestamp is used to delimit the records, the TimeGenerated property of each record stored in OMS is populated with the date and time specified in the log file. If the New Line method is used instead, TimeGenerated is populated with the date and time at which Log Analytics collected the entry.
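To illustrate the difference between the two delimiting methods, here is a small Python sketch of timestamp-based delimiting (purely illustrative, not the agent's actual logic): lines that do not start with a timestamp, such as the lines of a stack trace, are appended to the preceding record.

```python
import re

# Records start with a YYYY-MM-DD HH:MM:SS timestamp; lines without
# one (e.g. stack-trace lines) belong to the preceding record.
TIMESTAMP = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def split_records(text):
    records = []
    for line in text.splitlines():
        if TIMESTAMP.match(line) or not records:
            records.append(line)
        else:
            records[-1] += "\n" + line
    return records

sample = (
    "2016-04-21 10:15:02 request failed\n"
    "  Traceback (most recent call last):\n"
    "    ...\n"
    "2016-04-21 10:15:07 retry succeeded\n"
)
print(len(split_records(sample)))  # → 2
```

With New Line delimiting, the same text would instead produce four separate records, one per line.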
- Add the log path to collect:
- Select Windows or Linux to specify the path format
- Specify the path and add it with the + button
- Repeat the process for each path to add
When you enter a path, you can also use a wildcard in the file name. This is useful to support applications that create a new log file each day or when the current file reaches a certain size.
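For example, a single pattern such as app-*.log covers daily rolling files in one configuration entry; the Python sketch below shows the equivalent expansion (the directory and file names are hypothetical):

```python
import glob

def matching_logs(pattern):
    # Expand a wildcard path such as /var/log/myapp/app-*.log to the
    # concrete files that exist right now.
    return sorted(glob.glob(pattern))

# Applications that roll to a new file each day are covered by one entry:
for path in matching_logs("/var/log/myapp/app-*.log"):
    print(path)
```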
- Assign a name and description to the configured log.
The suffix _CL is added by default.
- Validate the configuration.
Once Log Analytics begins collecting the custom log (you may have to wait up to one hour from activation for the first data), you can consult it through Log Search in the OMS portal. In the Type field, specify the name assigned to the custom log (for example, Type=nginx_error_CL).
After configuring the collection of the custom log (each entry is saved in the RawData property), you can parse each record in the log into individual fields using the Custom Fields feature of Log Analytics. This allows us to analyze and search the data more effectively.
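To give an idea of what such parsing extracts, here is an illustrative Python sketch on a hypothetical nginx-style error line; Custom Fields performs a comparable extraction on the service side, without any code on your part:

```python
import re

# Hypothetical nginx-style error line as stored in RawData.
RAW = "2016-04-21 10:15:02 [error] 1234#0: upstream timed out"

# Named groups stand in for the individual fields that Custom Fields
# would extract from each record.
PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"\[(?P<level>\w+)\] (?P<message>.*)"
)

fields = PATTERN.match(RAW).groupdict()
print(fields["level"])  # → error
```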
In conclusion, Log Analytics is a powerful and flexible solution that allows us to collect data directly from custom logs, for both Windows and Linux machines, all through simple and intuitive guided steps. For those who wish to learn more about this and other OMS features, remember that you can try OMS for free.