I have a requirement to send Dynatrace data to Splunk. I am aware of the real-time streaming capability to Splunk; in that setup, DT sends data to the search head server where Flume is running.
We are looking for a solution where the DT data can be posted to a different server, from which Splunk forwarders will do the rest of the job.
Any suggestions on this?
Answer by Ari P. · Mar 20 at 04:21 PM
You can definitely do what you are asking. In fact, that is what we did at my client site. Here's what we did:
Install a Splunk Heavy Forwarder on a different server, install the Flume server and the Splunk app on the Heavy Forwarder, send the AppMon data to this Flume server, and then query the indexes on the Heavy Forwarder from the main Splunk infrastructure.
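The setup above can be sketched as a Splunk `outputs.conf` fragment on the Heavy Forwarder. This is a minimal sketch, not the exact configuration from my environment; the group name, hostname, and port are placeholders you would replace with your own:

```ini
# outputs.conf on the Heavy Forwarder: forward the AppMon data it
# ingests on to the main Splunk infrastructure.
# "primary_indexers" and "indexer.example.com:9997" are placeholders.
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = indexer.example.com:9997
```

With this in place, anything the Flume/app inputs pick up on the Heavy Forwarder is forwarded to the main indexers, which is why the local data folders don't need to be kept long-term.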
Let me know if you have any questions.
Answer by Ari P. · Mar 21 at 12:55 PM
You are correct. Installing the app takes care of the Flume process as well.
Yes, we do have a scheduled cleanup of the AppMon data folders. We do ours every week. Since the data is indexed into our actual Splunk infrastructure in near real time and stored there, you don't really need those folders to sit there and grow in size.
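A weekly cleanup like ours can be as simple as a cron-driven `find`. This is a sketch, not our exact script; the `BT_DATA` path is an assumption for wherever your Flume BT feed writes its files:

```shell
#!/bin/sh
# Hypothetical weekly cleanup of the AppMon BT feed folders.
# BT_DATA is a placeholder path; point it at your Flume output directory.
BT_DATA="${BT_DATA:-/opt/flume/appmon-data}"

# Delete files older than 7 days. They are safe to remove because the
# data has already been indexed in the main Splunk infrastructure.
find "$BT_DATA" -type f -mtime +7 -delete
```

Schedule it with a crontab entry such as `0 2 * * 0 /opt/scripts/appmon_cleanup.sh` to run it every Sunday at 2 AM.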
We don't really use the dashboards either. We mostly use our integration to send Business Transaction data from AppMon into Splunk. The BT Feed does write all the data in three separate folders (one for PurePaths, one for User Actions, and one for Visits). The same goes for alerts (they have their own folder). You can essentially treat these folders as log files and hand them over to the Universal Forwarder.
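If you go the Universal Forwarder route, monitoring those folders is just a set of `inputs.conf` monitor stanzas. This is a sketch under assumptions: the folder paths, index name, and sourcetypes below are placeholders, not the names the BT feed actually uses in your environment:

```ini
# Hypothetical inputs.conf on the Universal Forwarder.
# Paths, index, and sourcetypes are placeholders.
[monitor:///opt/flume/appmon-data/purepaths]
index = appmon
sourcetype = appmon:purepath

[monitor:///opt/flume/appmon-data/useractions]
index = appmon
sourcetype = appmon:useraction

[monitor:///opt/flume/appmon-data/visits]
index = appmon
sourcetype = appmon:visit

[monitor:///opt/flume/appmon-data/alerts]
index = appmon
sourcetype = appmon:alert
```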
I personally do not have any experience with the Splunk HEC so I can't really speak to that.
Hope this helps,
Answer by Harin Y. · Mar 21 at 12:48 PM
Thanks @Ari P. That helps. I believe installing the app takes care of Flume as well.
So, do you do a scheduled cleanup of the Flume data as well?
We won't be using the dashboards that come with the Splunk app. I believe you suggested the Heavy Forwarder so that I can install the app, and Flume comes with it. What if I just write the Dynatrace data as log files on a different server and hand the job over to a Universal Forwarder?
Also, I heard about the Splunk HEC. Does that do a better job?