Question by Kathy H. · Jan 27, 2014 at 05:19 PM

Comparing cloud vs. on-premise

We are researching moving our application to the cloud. Primarily we are looking at our batch tier and how it performs in the cloud vs. on-premise. Our batch runs multiple jobs in parallel, some of which run for hours. Each job runs in its own JVM and is represented by a separate PurePath.

We have Dynatrace both in the cloud and on-premise, and we run the same batch jobs with the same data in each location. A first glance at the test results suggests that I/O methods take longer in the cloud, for example the java.io.UnixFileSystem methods.

I am looking for a way to compare the set of batch jobs on-premise vs. in the cloud. For example, is there a way to compare the method breakdown for the set of batch jobs to understand the method hotspots in each location?
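(As a side note, the raw filesystem gap can be sanity-checked outside of Dynatrace with a small standalone probe. The sketch below is illustrative only: the directory path and iteration count are placeholders, and it simply times the java.io.File calls that java.io.UnixFileSystem serves underneath.)

import java.io.File;

// Minimal sketch: time the java.io.File calls that UnixFileSystem handles
// (exists()/isDirectory() -> getBooleanAttributes, list() -> list) against
// the same directory tree in each environment.
public class FsLatencyProbe {
    public static void main(String[] args) {
        // Placeholder path: point this at a directory the batch jobs touch.
        File root = new File(args.length > 0 ? args[0] : "/data/batch");
        final int runs = 10_000;

        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) {
            root.exists();       // UnixFileSystem.getBooleanAttributes
            root.isDirectory();  // UnixFileSystem.getBooleanAttributes
            root.list();         // UnixFileSystem.list
        }
        long avgMicros = (System.nanoTime() - start) / runs / 1_000;
        System.out.println("avg per iteration: " + avgMicros + " µs");
    }
}

Running the same class in both environments and comparing the averages would show whether the slowdown sits in the filesystem itself rather than in anything JVM- or instrumentation-specific.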

 

2 Replies

Answer by Kathy H. · Jan 30, 2014 at 08:47 PM

Yes, most methods are from Auto Sensors.

There are hundreds of batch jobs (and PurePaths) in a batch run. I was hoping for a way to characterize the entire batch run across all PurePaths, e.g. "java.io methods are 20% slower in the cloud."

Is your recommendation to analyze one PurePath at a time?


Rick B. · Jan 30, 2014 at 09:18 PM

Hi Kathy,

The API Breakdown comparison outlined by Andreas above should help you achieve this at the larger scale. If there are particular packages you're concerned about, you can define an API for those methods in the System Profile (System Profile > APIs) for a more tailored comparison via this dashlet; otherwise we model the data into known/detected APIs.

Hope that helps,

Rick B


Answer by Andreas G. · Jan 28, 2014 at 07:54 AM

Hi Kathy,

I would build a dashboard with the API and Method dashlets on it, then set the Source to a session that contains an on-premise PurePath and the Comparison Source to the session containing a "cloud" PurePath.

The challenge in your case is that most of the methods you see, e.g. UnixFileSystem, are probably picked up by our Auto Sensors. Correct? They show up with a grey icon in the PurePath tree?

Right now there is no good comparison of Auto Sensor data. But if you start with the API Breakdown, you will see which API shows the biggest difference, and the Method dashlet will give you an indication of which "instrumented" methods differ.

Then I would open two Method Hotspot dashlets side by side. The Method Hotspot dashlet doesn't provide a built-in comparison feature like API and Method do, but you can place two side by side and set a different data source for each. Then you can compare and see which method hotspots differ. It is a visual comparison, but it should work well, especially because your PurePaths run so long that the hotspots and their differences should be easy to spot.
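If you want one number for the whole run (e.g. "java.io is 20% slower in the cloud"), another option is to export the breakdown from each session and diff the two files offline. Below is a minimal sketch, assuming each export is a CSV with the API/method name in the first column and total execution time in milliseconds in the second; the separator, column positions, and file names are assumptions, so adjust them to your actual export.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch: read two breakdown exports and print the per-entry
// difference of cloud vs. on-premise execution time as a percentage.
public class BreakdownDiff {

    static Map<String, Double> load(String file) throws IOException {
        Map<String, Double> times = new LinkedHashMap<>();
        List<String> lines = Files.readAllLines(Paths.get(file));
        for (String line : lines.subList(1, lines.size())) { // skip header row
            String[] cols = line.split(",");                 // assumed separator
            if (cols.length < 2) continue;
            times.put(cols[0].trim(), Double.parseDouble(cols[1].trim()));
        }
        return times;
    }

    public static void main(String[] args) throws IOException {
        // Placeholder file names for the two dashlet exports.
        Map<String, Double> onPrem = load("onprem-breakdown.csv");
        Map<String, Double> cloud  = load("cloud-breakdown.csv");

        for (Map.Entry<String, Double> e : onPrem.entrySet()) {
            Double cloudTime = cloud.get(e.getKey());
            if (cloudTime == null || e.getValue() == 0) continue;
            double deltaPct = (cloudTime - e.getValue()) / e.getValue() * 100.0;
            System.out.printf("%-50s %+7.1f%%%n", e.getKey(), deltaPct);
        }
    }
}

A positive number then means that API or method is slower in the cloud.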

Hope this helps!

