Becoming a Scientist with your Monitoring Data
Emily Duncan
Thought Leader | Customer Advocate | Manager Systems Engineering | Majors | Palo Alto Networks
Yesterday my kids and I read all about the Blood Moon lunar eclipse as we waited patiently for 11:12 pm to arrive so we could witness it ourselves (seeing is believing). I started to wonder how science could be so exact - a date and a time, right down to the minute - that the eclipse would actually happen. How did they know it would be blood red? How could they tell it was a supermoon? Then I started thinking about the way we often approach monitoring - our data, our systems, and even our lives. We wait for something to go red, and then we act. Red alerts, people calling and yelling, stress levels rising, bodies failing - it is the most common incentive to act. Why? Scientists have figured it out: observe, measure, prove, predict. They've got it down to 'a science', so much so that we trust them when they tell us that at 11:12 pm CDT there will be a lunar eclipse and the moon will turn blood red. We trust their data.
Monitoring can be that way IF you don't just 'react' to events. Events are state changes - meaning something has already happened. How in the world could we ever be proactive or predictive if all we ever pay attention to is a state change? Threshold passed 95%, response time slow, disk space 90% utilized. If scientists never looked at the big picture - what the moon looks like in its normal state, each and every minute - they would never be able to predict with 99.9% certainty when the moon would be full, let alone a blood red lunar eclipse.
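To make that difference concrete, here is a minimal sketch in Python. The metric, the sample values, and the three-standard-deviation band are all illustrative assumptions on my part, not a prescription - the point is only the contrast between waiting for a static threshold and learning what 'normal' looks like from history:

```python
import statistics

# Hypothetical example: recent hourly CPU utilization samples for one host.
history = [42, 45, 44, 47, 43, 46, 48, 44, 45, 47]   # what "normal" has looked like
current = 61                                          # the value we just observed

# Reactive approach: a fixed threshold only speaks up once things are already red.
STATIC_THRESHOLD = 95
if current > STATIC_THRESHOLD:
    print("ALERT: threshold passed")

# "Scientist" approach: learn normal from history, then flag drift away from it
# long before the static threshold is ever crossed.
mean = statistics.mean(history)
stdev = statistics.stdev(history)
upper_band = mean + 3 * stdev

if current > upper_band:
    print(f"Anomaly: {current} is outside the normal range "
          f"({mean:.1f} +/- {3 * stdev:.1f})")
```

In this toy example the static threshold stays silent while the baseline has already noticed that 61 is nothing like the 40s the host normally lives in - that is the "observe, measure, prove, predict" habit in miniature.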
It is absolutely a process change. We are programmed to 'react' when things go red, and when they aren't red - well, why add one more thing to worry about while everything is still green? Legacy monitoring tools were designed to 'alert' when something goes bad. They were not programmed to tap you on the shoulder, send you an email, or page you when all was right with the world. Why not? Shouldn't the monitoring solutions you put in place also be able to show you what normal looks like? Shouldn't they be able to take events, marry them to metrics, and present the entire picture of what your environment looks like when it is - and when it isn't - performing at optimal capacity? How do you even know what optimal looks like if all you ever pay attention to is less than optimal? What if optimal changes? Do your monitoring tools tell you that what was a red threshold last week or last month has become normal today? Why in the world, in 2019, are we not becoming scientists of our data?
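And if 'normal' itself changes, the baseline has to move with it. A rough sketch of that idea, again in Python with window sizes I am assuming purely for illustration, recomputes the band from a rolling window so that last month's red can be recognized as today's normal:

```python
from collections import deque
import statistics

# Hypothetical sketch: keep roughly a week of hourly samples so "normal"
# is recomputed as behavior drifts, rather than set once and never revisited.
WINDOW = 7 * 24
samples = deque(maxlen=WINDOW)

def check(value):
    """Return True if value falls outside the currently learned normal band."""
    if len(samples) < 48:              # not enough history yet to define normal
        samples.append(value)
        return False
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples) or 1.0
    is_anomaly = abs(value - mean) > 3 * stdev
    samples.append(value)              # yesterday's red can become today's normal
    return is_anomaly
```

Because every observation is folded back into the window, the band quietly follows the environment - which is exactly the question to ask of your own tools: do they re-learn what optimal looks like, or do they keep comparing today against a threshold someone set long ago?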
We are three weeks into a new year, and many of us made New Year's resolutions. I am pretty sure most of us didn't make resolutions like, "If I have a heart attack I will eat healthier and exercise more." Instead we take the opportunity to start fresh, to begin the year with a different perspective. I challenge you to do the same thing: make a New Year's resolution about the way you approach monitoring your data. Figure out whether what you are doing right now truly gives you enough insight to predict things the way scientists do - on May 26th, 2021, in the early hours of the morning, there will be another lunar eclipse with a Blood Moon, and it's only going to be red for about 15 minutes, so act quickly.
We don't all have to be scientists to predict the future; we just have to pay attention to our data - even when all seems right with the world. #splunk #splunkITSI #splunkITOA