Splunk is hands down the best log analysis tooling I've used. If not for the hefty price tag, I'd use it for my personal stuff and at every workplace I've been. Structured logs and Splunk are the stuff dreams are made of if you care about monitoring the quality of software.
The logs-to-metrics capabilities, along with the ability to uncover relationships in your data, are amazing. Mouse over the fields found in logs matching your search and see the top N values for each of those keys.
Imagine getting an alert, searching your logs for that error message, and immediately seeing that it disproportionately affects N users, that it is split 50/50 between two of your seven regions, and that it only affects version X of your service. A couple more searches to dig in and you can see it is only feature Y with setting Z that is the problem. You switch to a timechart view and can see the moment the error started and the affected user counts. A few more minutes and your support team has a list of known affected users. You decide to monitor this new feature, so you quickly create a new dashboard (or a panel on an existing dashboard) and a new alert. At no time did you have to declare a field of your structured logs as an index, or as searchable or aggregatable.
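To make that concrete, the whole triage above is just a handful of one-line searches, roughly like the following (the index and field names are invented for illustration, not from any real deployment):

    index=myapp_prod "payment timeout" | top limit=20 user_id
    index=myapp_prod "payment timeout" | stats count by region, app_version
    index=myapp_prod "payment timeout" | timechart span=5m count by region

Any field extracted from the structured logs can be dropped straight into top, stats, or timechart without ever having been declared indexed or aggregatable up front.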
Splunk has delivered this level of innovation and quality since 2007 when I first used it.
We used Splunk to carry a change request ticket number all the way through the change control process to the Puppet log output, tagging each change with its original business purpose.
It was like magic for auditors back then, and I rarely see that depth of tracing from automated changes back to business purpose in the field today, though we get close with GitOps.
The entire moat is gone. The biggest value driver they had was the integrations to get the data in, but now eBPF, Telegraf, and Vector have destroyed that moat.
With Vector you can even source from Splunk and move elsewhere.
OMB Memorandum M-21-31[0], “Improving the Federal Government's Investigative and Remediation Capabilities Related to Cybersecurity Incidents,” includes directives to ensure event logging goes well beyond the current norms.
By all accounts I've heard, it's going to enrich the fortunes of every single SIEM/log aggregation company out there; pretty much every government contractor is going to need larger licenses in the next few years as contracts get rewritten with this EO in mind.
Partially, but Splunk has actually been on the market for some time. Also, large companies that compete with Cisco, like CRWD, PAN, etc., have been building out SIEM capabilities, as has Cisco, though Cisco being Cisco, it didn't get the attention needed.
We [Notion] switched to Splunk Cloud a year or so ago, and it's vastly better than the other logging systems we've used. Much, much better than Kibana/Elasticsearch. We don't need to worry about indexed property limits anymore, yay. I'm a happy user.
Same for us [Obsidian Sync], although we haven't had to worry about property limits yet - and it seems like we won't have to either. For us, it was having a lot of in-house experience with Splunk already that gave us a reason to consider it and, in the end, settle on it.
The software seems very lazy. The interface belongs in the 90s. They've been resting on their laurels for eons. The fuckin basic-ass PowerShell IDE that comes with Windows is about seventeen trillion times better designed and more user-friendly.
That’s… a compliment? There have been very few positive interface developments in the last 2 decades for power users. If you want to rip out 95% of the functionality and 99% of the usefulness so morons with iPads can navigate it, then it probably needs adjustments.
>The interface belongs in the 90s
Maybe this is why I actually like Splunk. Everything is simple and intuitive. Modern UIs seem to be universally terrible.
That's fine as long as your product stays competitive.
But as you lose the smaller and mid-range customers, you're also missing out on market trends while getting shaken up by the big players you can't afford to say no to. If one of your whales needs feature Y, no matter how exotic you think it is, you'll have to implement Y, bloating your product for the rest of your clients.
And while you're doing that, smaller competitors slowly creep up, eating up the bottom of your market, until you're stuck in a niche.
I'm at a Fortune 100 company and we are looking at replacing Splunk with Sentinel because of Splunk's cost. I don't use either in my day-to-day and have no horse in the race, but if my company is doing it, then the cost of Splunk must not be trivial.
> And while you're doing that, smaller competitors slowly creep up, eating up the bottom of your market, until you're stuck in a niche.
So what? Milking mega enterprises for ossified products is a decently profitable niche. IBM, SAP, that huge American company powering a lot of hospital IT, Cisco itself...
ServiceNow actually is quite decent... if you have a good management team, that is. I know a well run implementation and one that's a horrid clusterfuck no one wants to use (and because of that, they're implementing some AI chatbot, which I'm sure will piss people off even more).
I completely disagree with both the spirit of the comment and the particular strawman presented.
It is not better at all, by almost any metric other than overhead. Losing 1 of 1,000 customers @ $1,000 is very different from losing 1 of 1 customer @ $1M. One is easy to manage; the other leaves you dead in the water. In addition, you'd start to make concessions and unnatural decisions because your revenue is so lopsided. And you're going to get completely fucked at renewal time. And on, and on...
Good M&A teams know this. They build a risk profile when revenue is a component of the acquisition. The acquiring party gets to learn a lot about the fundamentals when putting deals together and it's all factored in.
To put it simply: having a healthy balance of revenue from multiple sources commands a premium. Those are opportunities to advance your relationships and grow. Too many eggs in too few baskets is a major red flag that will have your revenue working against you.
They're profitable now with Cisco acquiring them. Cisco will trim the company by at least 20% through restructuring, either out of the company or into other parts of Cisco.
They'll pick up another 10-20% on capex/opex/COGS from the private pricing that Cisco gets.
Great M&A if Cisco manages to maintain Splunk's customer base. I look at Splunk as the Oracle DB of the world now: it does anything a giant enterprise can imagine, but it's old and costs an arm and a leg.
In sales we call this "Ideal Customer Profile." Why do I want a customer with less money to spend if I have a product with enough capability for the gigantic money-is-no-object customers?
I work at a 100+ year old giga-bank, systemically important in its home country, in their Hong Kong investment banking branch.
We loved Splunk; we invested quite a bit in it, both for technical monitoring and business intelligence. After a while the price went so high that we cut it all and moved to kdb/Tableau/ELK/whatever crappier system cost less.
Money is ALWAYS an object and Splunk makes sure to dig a hole deep enough for even the deepest pockets. I too prefer my shareholders to collect the fruit of my labor rather than... Splunk. At least they can reinvest some profit in us. Not Splunk, nope, they keep digging that hole in our pockets.
Spot on. I also work at a 100+ year old gigantic corporation with big money, and we are also moving off Splunk due to rising costs. Enterprise customers do not just pay whatever the sales folks ask for. Splunk is dead growth-wise if they don't fix their pricing.
We moved a business from Splunk to ELK a couple of years ago. The actual work of doing so took less than a day. The maintenance processes changed, and some things are not as good. But aside from the beefy machine we run ELK on, it costs next to nothing and is very reliable.
Mindshare is valuable, was the point GP was making. If midsize customers ignore you because you're too expensive, and then implement something else before they get big enough to afford you, where do you get new customers? Forget growth, how do you replace attrition as your existing customers die?
Personally I can't say if that's actually happening with Splunk, but it's a very plausible scenario.
I've recently dealt with multiple companies who started using IBM Aspera (which, as a vendor to them, means we have to use it too), only for it to work miserably. I've also seen a couple of tiny, perfectly functional MySQL databases replaced by expensive, slower Oracle databases with much higher maintenance costs.
I think once a customer with a big enough budget is recognized by sales at one of these big organizations, they make the sale happen. They talk to the higher-ups and either make them happy or feed them a lot of FUD (or both), and then they're in, regardless of what the people working with the products (many of whom might be external vendors or consultants!) think.
They're basically focused on more traditional sales & marketing instead of more grassroots sales & marketing (mindshare), but at least in my experience they definitely still get new customers.
> Mindshare is valuable, was the point GP was making. If midsize customers ignore you because you're too expensive, and then implement something else before they get big enough to afford you, where do you get new customers? Forget growth, how do you replace attrition as your existing customers die?
Somehow companies manage to make it work by extracting money from their existing money-is-no-object customers. Oracle and IBM have basically zero mindshare amongst HN-reading folks, yet there they are.
Microsoft dominated the nineties especially, and the aughts less so but still, because the marginal price of their OS was zero - due to piracy. Yes, they didn't like businesses running unlicensed copies, but if you were a customer, nobody cared, because in 5, 10, or 20 years you'd be a paying business or would work for a paying business.
Splunk doesn't get that. There are no hobbyist/prosumer Splunk installations. Zero. Nada. That's also how Linux won in the server space - nobody set up Windows servers as a hobby, and 20 years later we're here.
IOW it's medium-term short-sightedness, if that makes sense. Tactically good, strategically so-so to bad, depending on your moat and momentum.
> Splunk doesn't get that. There are no hobbyist/prosumer Splunk installations. Zero. Nada.
Not true. I ran a free (legit!) Splunk instance in my homelab for years. It's been several years since I shut the homelab down, so I couldn't tell you if they still have hobbyist licensing, but they certainly had it in the past.
It was at one point usable, but they drove off the hobbyist/small business crowd a long time ago. We do some work setting up Elasticsearch tools that aggregate and filter data before it's sent on to central Splunk, purely to effect a large reduction in license costs.
Kibana and Graylog on top of Elasticsearch/OpenSearch. Even the commercial licenses on those are usually a tiny fraction of Splunk's costs, and Graylog does enough for free that it's a much easier path to stand that up and then buy the correlation functionality if you really need it.
For some organizations, what Splunk does well is important, but most of them really only need much more basic log aggregation and analysis tools.
I believe the idea is that the big customers are interested because everyone is raving about it. If you price out the smaller customers, there's nobody to rave about it.
Consider, for example, that Akamai's revenues have been sitting in a plateau for the last 5 years, while Cloudflare is moving up.
> I believe the idea is that the big customers are interested because everyone is raving about it. If you price out the smaller customers, there's nobody to rave about it.
That's not how enterprise procurement works, which is what makes the big bucks for companies like Akamai and Splunk.
Cloudflare traditionally targeted mid-market and is in the process of building out an upper market/enterprise motion (I worked with the guy they hired to lead that in a previous role).
I can dig deeper into ICP, Market Segmentation, and Enterprise sales if anyone is interested. There is too much FUD on HN.
>big customers are interested because everyone is raving about it.
In this case the big customers are already using it. Splunk's value proposition for those customers is that it can handle massive volume without a hiccup. Small customers don't have the needs where Splunk is uniquely useful.
Because those medium-sized customers become large customers, and getting more people to use your product builds up a skill set in people. Switching costs are very high. This is why we'll probably see Datadog and New Relic dominate the logging space, because of their no-contract plans that you can scale up to negotiated rates when you become larger. Even getting a POC of Splunk is expensive, and the sales team will push for a contract.
What Splunk has going for it now is that they have a lot invested in compliance and security, but it's only a matter of time before other providers start offering the same. The only use case I would consider them for is a SIEM. Datadog logging is so cheap, and it works, and it gives me more money to spend on other things.
Maybe, but the Splunk query language is reasonably well liked by its users, at least in the security space. It's much more approachable than SQL, which seems to be what all new tools these days are forcing users to use due to their dependence on Snowflake and Presto/Trino. In Splunk, you can type free-text queries, and you can also add structure. Fairly flexible. We've been asked many times to make Scanner's query language more like Splunk's.
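For what it's worth, here is roughly what that mix of free text and structure looks like in SPL (a made-up example, not from Scanner or any particular deployment): a bare keyword and a quoted phrase act as full-text filters, field=value terms add structure, and pipes layer aggregation on top:

    error "certificate expired" sourcetype=auth
    | stats count by host
    | where count > 10

Getting the same answer in SQL generally means knowing the table and column layout up front, which is a big part of why analysts find SPL more approachable.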