Netdata Community

Anomalies collector feedback megathread!

I just wanted to create a thread here to capture feedback, in one place, from users of the anomalies collector.

I would love to hear all positive and negative feedback so that we can incorporate it into future versions of the collector, or even into new ones we might try to build in Go, or slightly different ones looking at more specific things like changepoint detection or online/streaming anomaly detection.

If you would rather, feel free to just drop us a mail instead. Ideally, though, I was hoping to get some discussion going in here for everyone in the Netdata community who might be interested in these sorts of features.


I was wondering last week if it's possible for our team to create some preconfigured custom models for popular applications like MySQL, PostgreSQL, OpenVPN, etc. This way users wouldn't have to set up the charts_regex themselves, but could instead just say "yes" to a custom model that helps them monitor the most critical types of anomalies for that application.

If we had some smart default custom models, we could then mention them as a novel, zero-configuration way to monitor X or Y application.
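As a rough illustration, such a preconfigured model could ship as a commented-out block in the collector's conf. The chart and dimension names below are placeholders, not a tested MySQL model; the `custom_models` key takes a `name` plus a list of `chart.id|dimension` pairs:

```yaml
# Sketch of a hypothetical preconfigured custom model for MySQL.
# Chart/dimension names here are illustrative assumptions only --
# a real default would need to be validated against the mysql collector.
anomalies:
    custom_models:
      - name: 'mysql_queries'
        dimensions: 'mysql_local.queries|reads,mysql_local.queries|writes'
```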

I think that's a great idea. I guess we could just add them as commented-out sections in the conf. Or maybe a better way would be a sort of library of confs for each use case; I'm just not sure where such stuff would live. A library of reference confs for each collector? Maybe a subsection in the docs.

Hi guys,
So I started playing with anomaly detection yesterday. I first added an SNMP collector to collect data from the internal and internet-facing interfaces on my MikroTik router.

I then created the config for anomaly detection for the SNMP collector. So far so good; I will test over the weekend to see how it goes and provide feedback.
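For anyone wanting to try the same thing, a sketch of what that anomalies conf might look like is below. The `snmp\..*` regex is a placeholder and depends on how your SNMP collector job names its charts, so treat it as an assumption to adjust:

```yaml
# Sketch of an anomalies.conf job scoped to SNMP charts.
# 'snmp\..*' is a placeholder pattern; check your dashboard for the
# actual chart name prefix produced by your SNMP collector job.
anomalies:
    name: 'snmp_anomalies'
    charts_regex: 'snmp\..*'
    charts_to_exclude: ''
```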

I have some questions though:

  1. It looks like the default training schedule is to retrain every 30 minutes using the previous 4 hours of data. Does it also use the current model in its retraining? Ideally you would not want only 4 hours' worth of data to use for ML.
  2. What happens on a restart of the agent? Is the model saved, or does everything start from scratch?
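For reference, the schedule described in question 1 maps to two settings in the collector conf (key names as I recall them from the default anomalies.conf; check your installed copy). Assuming a 1-second update_every, the defaults work out to "retrain every 30 minutes on the last 4 hours of data":

```yaml
# Training-schedule knobs in anomalies.conf (names assumed from the
# default conf; verify against your installation).
anomalies:
    train_every_n: 1800   # retrain frequency, in collection steps (~30 min at 1s/step)
    train_n_secs: 14400   # size of the training window, in seconds (4 hours)
```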