Greetings,
Does anyone use the PostgreSQL database option for the Performance Warehouse in a production environment? I have a client using it with only 2 applications currently being monitored. One is an internal app; the other is external, with a 50k member base generating about 4 million PP a day. The storage size has been set to 50 GB and it is maxing out, causing the database to run out of memory (roughly 2 GB; I'm not sure exactly, as I don't have rights to the machine).
This client does not have any PostgreSQL experts in house and is looking for advice on how to tune it for dT. Does anyone have any thoughts?
Chris
Answer by Ramon F.
Yeah.
My situation:
---
I have 24 Java agents, 6 .NET agents, and 1 web server/PHP agent at the moment. My PostgreSQL machine is a virtual machine with 4 vCPUs and 3 GB of RAM.
My storage management is set to 3 weeks of high-resolution data, 4 weeks of medium, and 5 weeks of low.
The average Measures Written rate is about 8,500 per minute.
---
It works fine here, using somewhere around 40 to 50 GB of disk.
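If you want to see where that disk space is actually going, a query like this against the Performance Warehouse database will list the biggest tables (just a quick sketch; run it with psql connected as the dT database user):

-- Show the ten largest tables so you can see what is eating the 40-50 GB
SELECT relname,
       pg_size_pretty(pg_total_relation_size(oid)) AS total_size
FROM   pg_class
WHERE  relkind = 'r'
ORDER  BY pg_total_relation_size(oid) DESC
LIMIT  10;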
Check the Storage Management configuration, how long you keep the data, and make sure you are not creating a lot of unnecessary sensors and measures. I have already had problems where that kind of load made my Performance Warehouse stop working properly.
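Since the original question mentions the database running out of memory, it is also worth checking the PostgreSQL memory settings themselves. A rough sketch (the suggested values in the comments are only assumptions for a machine with roughly 2-4 GB of RAM, not official dT recommendations; adjust to the actual box and restart PostgreSQL after changing postgresql.conf):

-- See what the server is actually running with
SELECT name, setting, unit
FROM   pg_settings
WHERE  name IN ('shared_buffers', 'work_mem',
                'maintenance_work_mem', 'effective_cache_size');

-- Rough starting points for a small machine (set in postgresql.conf):
--   shared_buffers       = 512MB   (around 25% of RAM)
--   work_mem             = 16MB
--   maintenance_work_mem = 128MB
--   effective_cache_size = 1536MB  (around 50-75% of RAM)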
Hope that helps; if not, feel free to ask.