Huge amount of storage compared to something like Papertrail. Any plans to work with Heroku and make this available via an add-on through their platform?
@hopkinschris Yup, Heroku is available now, just not through their platform yet. We have a CLI program that'll generate a "heroku drains:add" command for you. Let me know if you have trouble setting this up.
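For anyone curious, the generated command typically takes the standard Heroku drain form shown below; the endpoint URL, key, and app name are placeholders for illustration, not the actual LogDNA endpoint:

```
heroku drains:add "https://logs.example.com/heroku?key=YOUR_INGESTION_KEY" \
  --app your-heroku-app
```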
This looks great. The storage sold me. So many times I was forced to stop using a logging product because they offer a measly amount of storage.
Do you guys have plans for application-specific logging integration down the line, e.g. logging for NodeJS or Python applications?
@riyadhalnur Thanks for the kind words Riyadh. Yes, we'll have a full code library soon supporting many languages. We figured an agent would be sufficient for now. :) NodeJS would likely be first since we're also on Node.
@timolins Thanks for the feedback about the sign up process. That's an interesting idea to have a responsive dashboard and be able to run searches on the go! I'll pass this along to the team, thanks for the product idea!
@hexsprite Hey Jordan, thanks for the feedback! Docker...not yet. Syslog/Docker is next on our list of to-dos. We did have one company during our beta who logged syslog to a file and used our agent to ship it, though I'm not sure how easy that was to set up.
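For anyone wanting to try that workaround, a minimal rsyslog rule along these lines (the drop-in path and log filename below are just examples) would funnel everything syslog receives into one file a shipping agent can tail:

```
# /etc/rsyslog.d/50-all.conf (example drop-in path)
# Write every syslog message to a single file for a log agent to tail.
*.*    /var/log/all-syslog.log
```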
We recently signed up for Loggly, but many of their features we just don't use. And the ONE I really want (centralized 'tail -f') is on their enterprise plan. It looks like this does everything we need and nothing we don't.
One quick onboarding suggestion: The magic moment happened when I got to just run 'logdna tail' on my machine and it "Just Worked". Magic! However, asking me to sudo-install a bunch of stuff was a barrier. I would have gotten to the "magic moment" faster if I could have seen it in action first. A demo log source, perhaps? Just anything I could tail without needing to sudo-install code fetched from various untrusted sources. Of course, after a little digging on you guys I found you were pretty legit ; )
I'll probably be upgrading shortly! (Sorry Loggly!)
@rsweetland I know exactly what you're talking about (the magic moment) and that's why we made the live demo. We debated whether to have a demo source or not, but we ended up with what we have now because one of our goals was to get you using your own data (some people don't believe things work unless they see them with their own data). So yes, I totally hear you, but it's a hard line to draw in the sand. We may still do what you said, though. The debate's not over yet. :) And thanks for the feedback!
@nxbuzz No problem. I can totally see the other side as well. Either way, my story with LogDNA had a happy ending – and it was magical to see my own familiar log data suddenly streaming : )
It's hard to count the number of times I thought my life would be better if the logs weren't so hard to look through. I like the simple and clean look! Powerful too.
@angel54689 We're serial entrepreneurs who have built and sold companies and have a reputation for building scalable and secure solutions. We've worked heavily with eBay corp in the past to help launch many technology solutions. Feel free to check out our About Us page, where you can click through to our LinkedIn profiles. Having been a part of Y Combinator, reputation and trust are key to us. Happy to help out anytime.
How does this compare to the incredible amount of competition in this area? What makes this product different from / better than Splunk, Loggly, SolarWinds, etc.?
Log files can contain very sensitive information (although they shouldn't). What security measures are in place to ensure no one sees my log files other than me?
@mariogiambanco Sure. We probably won't be doing what Splunk is doing; we don't intend to be enterprise-focused at this time. As for the others, we're confident we have a good infrastructure set up to handle the much larger data volumes we're offering: 100 GB/month for free, with paid plans starting at 500 GB/month.
Absolutely. As I'm sure everyone in the logging space is doing, we also encrypt end-to-end traffic.
@nxbuzz I was going to mention the 100 GB/month for free (correct me if I'm wrong) - it really isn't - it's a maximum of 100 GB of storage per month, but the retention is listed at 2 days. If you only send 50 MB a day, in 2 days you'd have 100 MB, and on the 3rd day you'd be back at 0. When you're talking to a systems administrator (which is who these kinds of products are really targeted towards), days of retention has more value than data stored.
You're almost advertising enterprise storage numbers but boutique retention policies. Someone needing 100 GB/month of storage would need much more than 2 days of retention. A week (7 days) of retention is roughly standard on the free plans I've seen.
Not trying to be mean - It looks like a very simple, clean and easy to use interface - but the marketing should be cleaned up a bit.
@mariogiambanco Yes, you're right, retention is definitely important but each plan is tailored for a specific audience.
From what we've been hearing, most people on free plans use it for specific needs like testing/debugging their product in beta or a light production app, and for those, 2 days is plenty. Our free 100 GB/month means you can log all 100 GB on the first day before we stop ingesting, and it'll stick around for a couple of days. Or you can go 3 GB/day with a large burstable ceiling (up to 100 GB) as you need it. We think it's fair for what we can offer for free.
We're not trying to build a business off our free plan; our paid plans have 30+ days of retention, and that's where it'll matter to system administrators. And for those plans, we have generous data volume allotments.
Also I don't think we're offering enterprise storage numbers. We're simply offering a much higher limit than most people are used to seeing.
Nice! Using Papertrail now and considering a switch to Loggly. About 20 hosts. Some questions:
1. Do you have Chef packages available to install the daemon?
2. Why are you not using rsyslogd?
3. The instructions on how to join only allow you to specify one log path. Is there any way to define your log files in a config file?
@digitalbase Hi Gijs,
1 & 2. Not yet. We had to choose which features would make it for the launch and these had to be pushed. But rsyslogd is on our list.
3. Yup, you can add paths one or many at a time or you can modify the config file directly at /etc/logdna.conf
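For reference, a sketch of what that config file might contain is below; the field names and paths are illustrative assumptions, so check the agent docs for the exact format:

```
# /etc/logdna.conf (illustrative sketch)
key = YOUR_INGESTION_KEY
logdir = /var/log,/var/www/app/logs
```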
Thanks for the hunt @garrytan!
Hey everyone. I’m a co-founder of LogDNA. We were in Y Combinator's W15 batch working on an eCommerce marketing platform (PushMarket), only to realize we'd built a powerful logging system many of our friends wanted.
We had our “Slack moment” and decided to pivot. We were frustrated with the current logging landscape and built LogDNA around 3 philosophies:
1. More Storage - give away ample storage so you can log everything
2. Longer Search - faster and longer search retention
3. UI/UX - a much cleaner experience to interact with your log tails
I’m happy to answer any questions you may have!
PS - We love Product Hunt and want to give an exclusive 50% off to all new members for the week! :)
Wow, this is an insanely cheap option. @mrchrisnguyen are there any plans to add charting or analytics for specific searches in the near future? Does your tail support ANSI coloring?
@briancmuse Awesome that you mentioned charting and analytics - it's one of our most highly requested features and is on our immediate roadmap. We don't support ANSI coloring yet, but I've passed that request along to the team.
Feel free to sign up to LogDNA and you'll be notified when we roll out new features on a weekly basis!
I set this up on 4 servers in literally 3 minutes (yes, I was counting!)
So much needed! This solves one of my biggest pain points!
There is one thing I would absolutely love - Seeing these live tailed results inside a slack channel :)
@mrdhat Thanks for the feedback Akshay!
As for the live Slack tail, that is definitely possible, though Slack may not like it very much. We have considered this, but they have limits on their API for incoming data like this, and they recommend using a logging provider instead. On the other hand, you can set up an alert with a 30-second window to Slack; that'll work, but it'll be delayed by 30s.
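For the alert route, here's a minimal sketch in Python of formatting a batch of matched log lines into a Slack incoming-webhook payload; the app name, message layout, and 10-line cap are assumptions for illustration, not LogDNA's actual alert format:

```python
import json

def build_slack_alert(app_name: str, lines: list) -> str:
    """Format a batch of matched log lines as a Slack incoming-webhook
    payload (hypothetical layout; keeps only the 10 most recent lines)."""
    body = "\n".join(lines[-10:])
    return json.dumps({
        "text": f"*{app_name}* matched {len(lines)} line(s) in the last 30s:\n```{body}```"
    })

# The resulting JSON string would be POSTed to a Slack incoming-webhook URL.
payload = build_slack_alert("web-1", ["GET /health 200", "GET /orders 500"])
print(payload)
```

Batching on a window like this keeps you under Slack's incoming-message rate limits, at the cost of the delay mentioned above.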
@nxbuzz
> Slack may not like it very much
Makes perfect sense. I think another way could be to add a slash command/bot that does the same thing, but only when the user wants it. That way Slack might allow it. Not sure though.
Also, I see big potential for a Slack-like plugin system in LogDNA - I can think of 10 plugins that would be useful after using it for just one day! Do you have any plans to add something like that in the future?
Awesome product, and I really like how clean this looks. Wish it was out a while ago when I was messing with Rails and having to look at a horrid console with tail -f trying to debug stuff.
Wow talk about timing! Just last week we had a bug and it created a ton of log data. We were faced with either paying the absurd overage or stop logging. This would have saved me.
@_bsiddiqui Storage and search retention are the main differentiators for us. We want you to be able to log everything without worrying about how many gigs you're consuming. We also took a fresh approach with the user experience and our backend system. I'd love to hear your thoughts about our product - feel free to try the free demo.
Our team at Pantelligent has been happily using the LogDNA beta for the last few months. The product is well designed and easy to use for many members across engineering, devops, and marketing roles. Very impressed so far, and this is only the beginning of where this product can go! Setup takes about 2 minutes -- give it a try today.
@compumike thanks Mike. We're lucky to have your team on board and you helped guide the direction to where the product is today. We're looking forward to growing with your team at Pantelligent!
@devankoshal thanks for the kind words. Glad you enjoyed the easy onboarding experience. Feel free to message us anytime with any questions or comments! Happy to help out.