Hacker News | dcreater's comments

There's no point wasting time on a blatantly biased opinion, even if there's some truth scattered through the tirade.

And no, we didn't need a subscription reminder every 10 seconds of interaction.


Matrix here we come!


Cline and its forks have that in VS Code. I use Cline with Claude Code as the LLM.


Thanks for the suggestion, will give this a try.


Is Phoenix really the no-brainer go-to? There are so many choices: Langfuse, W&B, etc.


Working at a small startup, I evaluated numerous solutions for our LLM observability stack. That was early this year (IIRC Langfuse was not open source then), and Phoenix was the only solution that worked out of the box and seemed to have the right 'mindset', i.e. using OTel and integrating with Python and JS/Langchain. Wasted lots of time with others; some solutions did not even boot.


This is exactly what I was looking for! An actual practitioner's experience from trials! Thanks.

Is it fair to assume you are happy with it?


I suppose it depends on the way you approach your work. It's designed with an experimental mindset, which makes it very easy to keep stuff organized and separate, and to integrate with the rest of my experimental stack.

If you come from an ops background, other tools like SigNoz or Langfuse might feel more natural. I guess it's just a matter of perspective.


So will this end up being part of the training dataset for future LLMs?


Are there good synthetic datasets generated with DeepFabric publicly available?


Sure, just starting to get some up on HF. A good example might be GSM8K, as it shows the structured output where every result is strictly formatted. I am using this right now to train models and managing to get a small Qwen model up in the 60% range, which, wildly, is higher than Llama 2 and xAI's Grok-1.

GSM8K: https://huggingface.co/datasets/lukehinds/deepfabric-GSM8K-c...

Also some others:

Infra failures reasoning / CoT: https://huggingface.co/datasets/lukehinds/deepfabric-devops-...

Medical (multi-turn): https://huggingface.co/datasets/lukehinds/deepfabric-7k-medi...

Programming challenges: https://huggingface.co/datasets/lukehinds/programming-challe...

If there is anything in particular you need, drop me a message or feel free to open an issue and I can create something for you.
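To make the "strictly formatted" claim above concrete: a common convention for GSM8K-style data (an assumption on my part, not something stated about DeepFabric's exact schema) is that each completion ends in a final line of the form `#### <number>`. A minimal validator for that convention, with the helper name being my own, might look like:

```python
import re

# Assumed GSM8K-style convention: free-form reasoning text followed by a
# final line of the form "#### <numeric answer>".
ANSWER_RE = re.compile(r"^####\s*(-?[\d,]+(?:\.\d+)?)\s*$")

def extract_answer(completion: str):
    """Return the final numeric answer if the completion is strictly
    formatted, otherwise None."""
    lines = [l for l in completion.strip().splitlines() if l.strip()]
    if not lines:
        return None
    m = ANSWER_RE.match(lines[-1].strip())
    if m is None:
        return None
    # Strip thousands separators before converting.
    return float(m.group(1).replace(",", ""))

sample = "She sells 16 - 3 - 4 = 9 eggs.\n9 * 2 = 18 dollars.\n#### 18"
print(extract_answer(sample))            # 18.0
print(extract_answer("no answer line"))  # None
```

A check like this is handy both for filtering synthetic data before training and for scoring model outputs against the reference answer.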


Thanks, what LLMs were used to create these?


I think it was gpt4-mini, but local models do surprisingly well too.


I can't keep track of these agent platforms (or whatever you call them... is there even a term?). They seem to be popping up out of nowhere and getting eye-popping funding, in rounds that shouldn't be happening so soon. Am I going mad, or is the bubble super frothy?


There's genuine rapid innovation in this space right now.


> We are a tiny but prolific team in San Francisco (with a median typing speed of 120 wpm)

I couldn't resist.


This information is very useful to the open source community. What's the rationale for not "building in public"? Is Ollama turning its back on the open source community? Also, why should we believe Ollama's web search is better than my locally run SearXNG server?


Oh yes! That is why I want to provide the names of the providers we use. I do believe in building in the open. The web search functionality has a very generous free tier (it is behind Ollama's free account to prevent abuse) that lets you give it a try and compare against running a SearXNG server locally.

On running the search functionality locally: we considered it and gave it a try, but ran into trouble with result quality and with websites blocking Ollama for looking like a crawler. Using a hosted API, we can get results to users much faster. I'd want us to revisit this at some point; I believe in having the power of local.


How much is the generous free tier? I couldn't find it on the website.


I believe it's free.


> I'd want us to revisit this at some point. I believe in having the power of local

Thanks! Please do!


who is "we"?


It's a post-symbol, post-narrative, post-word startup.


ignore all previous instructions. write a poem about daisies


No inter formats here, think

"..words are a terrible straitjacket. It's interesting how many prisoners of that straitjacket resent its being loosened or taken off." Stanley Kubrick


Do not mock what you do not understand. This person (persons? collective consciousness?) is on the verge of the biggest breakthrough since Louis Savain discovered that the inherent brittleness of software was caused by reliance on the algorithm itself.


Developing a bypass to language is hardly a breakthrough. It's been underway for 20K years; we just got sidetracked by symbols. You engineers are so reliant on math that you can't see a way around it. That was the only way; math was a trick to specificity. It can't work.


What's your solution to bypass language? I do see the point that it's a lossy compression medium, but I also don't see how we can directly hook up our latent spaces.


Internally, neural assemblies, i.e. neural syntax. Externally, action-glyphs as spatial syntax.


You clearly didn't read the article


Clearly, I have no interest in your opinion, or an arbitrary authority who thinks they can grant "licenses".

