Splunk Enterprise — About Time

7 min read · Jun 14, 2022


I made this blog to provide common Q&A information to anyone interested in using Splunk, and it also works well as a reference. Please visit Splunk for the official learning courses.


When an event is ingested, its timestamp is stored in the _time field, which is used to build the event timeline in the Splunk Web user interface. The _time field is stored with the event in the index prior to search time, along with default fields such as host, source, and sourcetype. Timestamps are expressed in Unix (epoch) time and translated to human-readable time during the search operation. All events are sorted by time, which makes time the most efficient filter. The _time field's timestamp is displayed underneath the Time column within each event and is adjusted to the local time zone, as long as the time zone is set in the account settings. Every event also has a specified host, source, and sourcetype, as well as the index the event is stored in.

earliest and latest are time modifiers that use plus or minus syntax to look forward or back in time by a specific time unit, and can round down (snap) to a specific time unit using the @ symbol.
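A minimal sketch of these modifiers (the index name is a hypothetical example):

```
index=web earliest=-2h@h latest=@h
```

Here earliest=-2h@h means two hours ago, snapped down to the start of that hour, and latest=@h means the start of the current hour.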

All events have a value assigned to _time, but not all source events contain an actual timestamp. For example, searching with sourcetype=ps pulls up process data that has no timestamp in the event itself; in that case _time defaults to the time the event was indexed.

You can place your data into a table using the bin command. It groups the _time values into bins, each a fixed interval (for example, X hours) apart, and that information can be tabulated when the bin command is followed by the stats command. You can then use eval to create a new field, using the strftime function with arguments in the format you want, and finally add the table itself.
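As a sketch, the pipeline described above might look like this (the index name, one-hour span, and time format are assumptions):

```
index=web
| bin _time span=1h
| stats count BY _time
| eval time=strftime(_time, "%m/%d %H:%M")
| table time, count
```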

The bin command has a span option, set to an integer plus a timescale, that lets us set the size of each bin; span can be used to specify a time interval for the bins.

Here the search runs across the games index with a sourcetype of SimCubeBeta, looking for events that contain an action field over the last 10 minutes. It groups the _time values into bins two minutes apart, then places that information in a table with a count of events and a list of the actions by time.
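Based on that description, the search might look something like this (run over the last 10 minutes):

```
index=games sourcetype=SimCubeBeta action=*
| bin _time span=2m
| stats count, list(action) AS actions BY _time
```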


eval writes the result of an expression to a new or preexisting field. You can chain multiple eval expressions together by comma-separating them.
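A small sketch of chained eval expressions (the field names are hypothetical):

```
... | eval total=price*quantity, discount=total*0.10
```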

The now function returns the time a search was started, while the time function returns the time an event was processed by the eval command. These functions can be set as the values of the fields we create in the eval command, storing those particular timestamps.

The relative_time function takes two arguments: the first is the actual time value, represented in epoch time, and the second is a time specifier using the time unit abbreviations. In the example, a new field called yesterday is set equal to the relative_time function. The first argument is the time generated by the now function; the second argument, relative to the time right now, looks back one day and snaps to the beginning of the current hour. The result is a time value in epoch time stored as the value of the yesterday field.
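The example described above could be written as:

```
... | eval yesterday=relative_time(now(), "-1d@h")
```

The "-1d@h" specifier means one day back from now, snapped down to the beginning of the current hour.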

strftime and strptime use date and time format variables, each representing a different unit of time.

In the example above, the epoch time generated by the relative_time function and stored in the yesterday field is formatted into a more readable form by the strftime function. The time value stored in yesterday is passed in as the first argument, and the second argument specifies the format: year, month, day, hour, and minute.
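A sketch of that formatting step (the readable_time field name is an assumption):

```
... | eval readable_time=strftime(yesterday, "%Y-%m-%d %H:%M")
```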

The difference between strftime and strptime is that strptime converts a time represented by a string into a Unix timestamp, going from a formatted value to epoch time. The first argument is the field that contains the formatted values, and the second argument uses the date-time format variables in the same order as the format in that first argument. The result is a newly created field with the epoch time values.
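A sketch of strptime going the other way (the formatted_time and epoch_time field names are assumptions):

```
... | eval epoch_time=strptime(formatted_time, "%Y-%m-%d %H:%M")
```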


| timechart — performs statistical aggregations against time. It is a transforming command that plots data over time, with time always in the leftmost column of the table and on the X axis of the visualization. It supports the same family of functions as the stats and chart commands and has a built-in span option that controls the size of the time buckets.

Timecharts are best visualized as a line or area chart, for example to see how many usage violations there were per day.

The timechart command also has a span option that allows a span to be specified with time units.
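A sketch of a daily timechart like the usage-violation example above (the index and search terms are assumptions):

```
index=security "usage violation"
| timechart span=1d count
```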

example for products

| timewrap — displays the output of the timechart command so that each time period is a separate series; it typically follows a timechart command and allows specific time periods to be compared to each other.

This example compares the number of password failures over the last week to password failures over the previous week.
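A sketch of that comparison (the index and search terms are assumptions; the two-week window covers last week plus the previous week):

```
index=security "failed password" earliest=-14d@d
| timechart span=1d count
| timewrap 1w
```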

Setting the time range to the last 19 days instead produces three columns, which adds an additional line to the visualization as it now covers three weeks.

Time Zones

Always check your data to make sure it is producing the proper results.

The date_time fields (such as date_hour) do not take the time zone set in Splunk Web into account.

Use strftime to normalize the data: pass the _time field as the first argument and the specific hour format as the second argument. The difference between date_hour and my_hour is that my_hour takes into account the time zone that is set.
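A sketch of that normalization, where my_hour honors the account's time zone, unlike date_hour:

```
... | eval my_hour=strftime(_time, "%H")
| table date_hour, my_hour
```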


I hope this helped answer some general questions for anyone just learning Splunk. I really enjoyed doing this and will be making more notes in the future.




Experienced Cyber Security/Intelligence Analyst with a demonstrated history of working in the US Military and IT industry.