Splunk Parsing Queue: seeing between 500 and 1000 blocked events on each heavy forwarder daily

Hi, I have a forwarder on AIX running version 4, and I am currently seeing between 500 and 1000 blocked events on each heavy forwarder daily. We use a Splunk Universal Forwarder to monitor WebSphere log files in our environment, and I have the following in the logs (02-13-2013):

  INFO TailReader - Could not send data to output queue (parsingQueue), retrying
  INFO TailReader - continuing.

I also see messages of the form:

  Forwarding to host_dest=<dest> inside output group primary_indexers from host_src=<src> has been blocked for ...

I found the blocked events by running the search index=_internal host=<myhost> blocked=true. You may have missed me stating this in my original post: it is not an issue with the parsing queue on the indexer, as there is no lag between the various stages; I checked the monitoring console first. Can someone help with understanding this?

The TailingProcessor message means that it was unable to insert data into the parsingQueue, which, as you might guess, is where event parsing occurs. The Splexicon defines the parsingQueue as "A queue in the data pipeline that holds data after it enters the system, but before parsing (event processing) occurs." In this case the data is being read off disk by the tailing processor and handed to the parsingQueue; after parsing (and the intermediate merging and typing stages) it moves into the indexQueue and on to the indexing pipeline, where the Splunk software stores the events on disk. Because the queues feed one another, a blocked downstream queue (for example, the parsing queue) will cause the upstream buffer to be maxed out.

Where to set a configuration parameter depends on the components in your specific deployment; for example, you set parsing parameters on the indexers in most cases (or on the heavy forwarders, since a heavy forwarder parses data before forwarding it).

Are any of the other queues on this forwarder filled as well? If not, output throttling would not seem to be the issue, as that would result in those downstream queues filling. The pattern of full queues tells you where to look: if the parsing and merging queues are full but the typing and index queues are not, start by taking a closer look at the stage that sits between them, since that is where data stops moving.
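One way to answer the "which queues are filled" question is to look at the queue metrics the instance itself logs. The following is only a sketch built on the standard group=queue events in metrics.log; <myhost> is a placeholder for the forwarder in question, and the field names should be sanity-checked against your own _internal data:

  index=_internal source=*metrics.log sourcetype=splunkd group=queue host=<myhost>
  | eval is_blocked=if(blocked=="true", 1, 0)
  | fillnull value=0 is_blocked
  | stats sum(is_blocked) AS blocked_count, count AS total, avg(current_size_kb) AS avg_size_kb BY name
  | eval blocked_pct=round(100 * blocked_count / total, 2)
  | sort - blocked_pct

A queue that shows a high blocked percentage while the queues downstream of it stay empty points at the pipeline stage immediately after it as the slow consumer.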

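If that comparison points at the parsing or merging stages, the usual suspects are line merging and timestamp extraction on multiline sources such as WebSphere SystemOut logs. Below is a rough props.conf sketch for the instances that actually parse the data (the indexers, or the heavy forwarders); the sourcetype name and the patterns are purely illustrative and have to be adapted to the real log format:

  # props.conf -- illustrative only; "websphere:systemout" is a hypothetical
  # sourcetype name and the regexes must match your actual WebSphere layout
  [websphere:systemout]
  # break events explicitly instead of relying on heuristic line merging
  SHOULD_LINEMERGE = false
  # SystemOut entries typically start with a bracketed timestamp on a new line
  LINE_BREAKER = ([\r\n]+)\[
  # tell Splunk exactly where the timestamp starts so it stops scanning
  TIME_PREFIX = ^\[
  MAX_TIMESTAMP_LOOKAHEAD = 32
  # guard against runaway events such as very long stack traces
  TRUNCATE = 10000

An explicit LINE_BREAKER with SHOULD_LINEMERGE = false and a bounded timestamp lookahead is generally much cheaper than letting Splunk merge lines and hunt for timestamps heuristically, which is often what keeps a merging queue pinned.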
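If the blocking only happens in short bursts and the downstream side catches up on its own, another option is to give the in-memory queues more headroom on the instance that is blocking. This is a sketch, not a recommendation: [queue=<queueName>] with maxSize is a standard server.conf setting, but the 10MB figure here is arbitrary, and a larger queue only buys time; it does not fix whatever is slowing the consuming stage.

  # server.conf on the instance whose parsingQueue is blocking -- sketch only
  [queue=parsingQueue]
  # raising the size absorbs bursts but does not cure a persistently slow stage
  maxSize = 10MB

If the queue refills no matter how large you make it, the bottleneck is further downstream: parsing settings, indexer I/O, or the network path to the primary_indexers output group.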