DrupalCon Portland 2013: LOGGING EVERYTHING (AND STAYING SANE)
Websites, especially busy ones, can generate a lot of data. And when something goes wrong, you need that data to figure out exactly what happened. Or perhaps you have one user who persistently experiences an issue that no one else is reporting.
How are you supposed to get to those logs that conveniently disappeared during the last catastrophic meltdown? When you're generating thousands of log entries per second, how are you supposed to find the real issues that your users are having?
This session will show you the advantages of shipping and parsing your logs with Logstash, storing them in Elasticsearch, and navigating them with Kibana. We'll also discuss shipping options from the server side as well as different ways to get logs from Drupal into Elasticsearch.
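As a taste of the pipeline the session covers, here is a minimal Logstash configuration sketch: read a log file, parse each line with a grok pattern, and index the result into Elasticsearch. The file path, pattern, and connection settings are illustrative, and exact option names vary between Logstash versions, so treat this as a shape rather than a drop-in config.

```conf
# Minimal Logstash pipeline sketch (paths and settings are illustrative).
input {
  # Tail a syslog file on the web server.
  file {
    path => "/var/log/syslog"
    type => "syslog"
  }
}

filter {
  # Parse the raw line into structured fields using a built-in pattern.
  grok {
    match => [ "message", "%{SYSLOGLINE}" ]
  }
}

output {
  # Index parsed events into a local Elasticsearch node.
  elasticsearch {
    host => "localhost"
  }
}
```

With events indexed this way, Kibana can then search and visualize them by field rather than by grepping flat files.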