Venturing into the real-time transaction data frontier – Part 1

It all started with one innocent customer question: “I use Splunk® Enterprise and INETCO Insight® extensively – now how do I get some of the granular transaction data INETCO Insight collects into Splunk?”

Learn about the INETCO NetStream real-time transaction data streaming application for Splunk Enterprise by watching this 45 minute webcast.

Twenty questions later, we learned that our customer was putting every log they could find into Splunk, and was building out a great view of infrastructure events.  Now they were interested in forwarding a complementary source of data – real-time application transaction performance and security events – from INETCO Insight into Splunk. Why? Because they saw great potential in combining log data with transaction data to expedite root cause problem isolation and gain a much broader view of how their end-to-end IT environment was performing and serving customers.

Feeling like our customer had the beginnings of a really good idea, we did enough validation to convince ourselves that there was a product opportunity here, and began work on what we now call INETCO NetStream.

Our mission was to provide an elegant, painless way for people to win their battles against the data bulge. How could we make the extraction of timely, actionable data easier?  What could we provide that would help isolate performance issues faster?  How could we help people make better business decisions?

Getting bored of our office walls and post-it notes, we decided it was time to kick-start this adventure and figure it out. First stop on our journey was Splunk .conf2013.  Talk about a Big Data explosion.  We learned…well, a lot.

After 600+ conversations with various Splunk users spanning application performance management, IT operations and security analytics, we verified that each person has their own unique needs when it comes to data (more on that in the next few blogs).  Almost every person admitted that their existing logs were not enough.  Right – good checkpoint.

We heard many people talk about the opportunities their Splunk implementations have opened up for them from an operational intelligence perspective, but also comment on how making good sense of ever-growing Big Data keeps getting harder.  When we asked why, some of the key challenges IT operations, security analysts and application support teams cited were:

  • Their data was not real-time
  • The majority of the Big Data they were looking at wasn’t that helpful
  • Usage was limited to those who knew the application well enough to know exactly what to search for
  • Security restrictions on production systems meant access to production logs was often limited
  • Both the access to and the data available in third-party and packaged application logs were inadequate

What also became clear was that the people who seemed to be winning the Big Data battle were doing so because they had become proficient in the key practices behind harnessing operational intelligence:

  • Extracting timely, actionable data from their IT environment
  • Searching data to identify patterns and one-off anomalies
  • Analyzing data to make good business decisions

We also learned that the transaction intelligence INETCO NetStream was proposing to provide, such as transaction request and response timings, network address data and full message payload information, was key for deriving actionable application performance intelligence.  The fact that we are able to pull this information off the network in real time, with no hardware devices, scripting or custom log file development required, definitely caught people’s attention.  Our plans to give users the chance to filter the exact data they need, prior to Splunk ingestion, also generated a lot of nods.  When we showed our proposed data model, known as the Unified Transaction Model, people were elated with the ease with which they could actually understand this depth of data.  Nice to verify people were interested in receiving more than a free t-shirt.
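To make that concrete, here is a minimal sketch of the kind of transaction event this sort of data could produce once forwarded into Splunk. The field names and values below are purely illustrative assumptions, not the actual Unified Transaction Model schema; the point is simply that key=value formatted events are extracted automatically by Splunk at search time, so transaction fields become searchable without custom parsing.

    # Illustrative sketch only -- hypothetical field names, not the INETCO NetStream schema.
    # Splunk extracts key=value pairs automatically, so an event in this shape is
    # immediately searchable (e.g. by response time or source address).
    transaction = {
        "app": "online-banking",        # hypothetical application name
        "src_ip": "10.1.4.22",
        "dest_ip": "10.1.9.8",
        "request_time_ms": 412,         # request/response timing
        "response_code": "00",
        "payload_bytes": 1286,          # size of the full message payload
    }
    event = " ".join('{}="{}"'.format(k, v) for k, v in transaction.items())
    print(event)
    # app="online-banking" src_ip="10.1.4.22" dest_ip="10.1.9.8" request_time_ms="412" ...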

Definitely a successful first outing.  As this journey continues with last week’s launch of the INETCO NetStream beta program, and with SplunkLive NYC on November 12th, we are excited to keep learning and building out all the possible use cases for transaction data.  Tune in to the next three blogs as we showcase how INETCO NetStream answers questions such as:

  • Why is a particular application slow, and how are user transactions being affected?
  • What unusual web or database requests were made today?
  • Are slow or failing servers costing us money or users?

For an introduction to INETCO NetStream and the power of transaction data, watch this October 29th webcast titled, “You Can’t Always Log What You Want.”  If you are interested in learning more, there is still time to join the beta program, or follow us at:

Blog: www.inetco.com/blog
Twitter: @INETCONetStream
Facebook: www.facebook.com/INETCONetStream
LinkedIn:  www.linkedin.com/company/inetco