Show HN: Bring-your-own-key browser extension for summarizing HN posts with LLMs (github.com/ivanyu)
68 points by ivanyu 1 day ago | 31 comments
Hi Hacker News,

I developed an open source browser extension for summarizing Hacker News articles with OpenAI and Anthropic LLMs. It currently supports Chrome [1] and Firefox [2] (desktop).

The extension adds summarize buttons to the HN front page and article pages.

It is bring-your-own-key, i.e. there's no backend behind it and the extension itself is free: you insert your API key and pay your LLM provider only for tokens.

[1] https://chromewebstore.google.com/detail/hacker-news-tldr/oo...

[2] https://addons.mozilla.org/ru/firefox/addon/hacker-news-tl-d...






Related: to save on cost, Hacker News is already summarized and available in feed form, which I find better than the default front page, where you have to re-read the same headlines because their order keeps changing. https://hackernews.betacat.io/ I also dislike how titles on Hacker News are so short that they often don't give enough information. E.g. headline: "Amazon Penterhorse". What is that?? That one doesn't exist, but the point is I have to click through to find out, and it's annoying. And on some posts, when I click the link, the person's blog post is just way longer than my interest level, so it doesn't get the point across. These summaries are just the right length.

I just read the summary for "Fermat's Last Theorem – how it's going" on that page and it completely missed the point of the article

These “related” spams on HN are so annoying. Make your own Show HN.

Would be nice to be able to provide your own endpoint so it could be directed to a local llm.

Thanks, good idea, this should be possible.

That would be a pretty killer feature IMHO. Ollama's API is pretty straightforward: https://github.com/ollama/ollama/blob/main/docs/api.md

There is also (or at least used to be?) an OpenAI compatible API layer for Ollama so that may be an option as well, though my understanding is there are some downsides to using that.

Note: This comment and the link are just meant as references/conveniences, not intended as a request for free labor. Thanks for opening up the code!


Forget ollama, just changing the URL from openai to your local server is enough, llama.cpp has a compatible endpoint. Most people just don't bother giving the option since you get a CORS error if it doesn't have a valid cert.
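As the comments above note, llama.cpp's server and Ollama's OpenAI-compatible layer both expose the same chat-completions route, so only the base URL changes. A minimal sketch of the idea, where `buildChatRequest` is a hypothetical helper (not a function from the extension's actual code):

```javascript
// Build a request for an OpenAI-style /v1/chat/completions endpoint.
// Only the base URL differs between the hosted API and a local server.
function buildChatRequest(baseUrl, model, articleText) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Local servers typically ignore the key, but keeping the header
        // lets the same code path serve OpenAI and a local endpoint.
        "Authorization": "Bearer sk-placeholder",
      },
      body: JSON.stringify({
        model,
        messages: [
          { role: "system", content: "Summarize the article in 3 sentences." },
          { role: "user", content: articleText },
        ],
      }),
    },
  };
}

// Hosted:    buildChatRequest("https://api.openai.com", "gpt-4o-mini", text)
// llama.cpp: buildChatRequest("http://localhost:8080", "local", text)
```

The result would be passed to `fetch(req.url, req.options)`; the model name is ignored or used for routing depending on the local server.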

Neat, I didn't know that! Thanks for the tip!


I made something similar that works on any link with whatever endpoint you want (and has special handling just for HN comments) https://github.com/TetraTsunami/linklooker

I've been pretty happy lately with my setup.

Arc browser lets you hover over a link to show a card that summarizes the article.

With Claude Projects, I'm able to quickly build an Arc "Boost" User Script for any site, so I have one to export the HN homepage to JSON to import into an LLM. And I have one on comment pages to do the same. I have a userscript to remove pagination so I can infinitely scroll and then export.

And I have a Claude Project specifically for identifying and categorizing comment threads by patterns of knowledge crystallization, etc. It's been fascinating so far.
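The "export the HN homepage to JSON" userscript mentioned above could be sketched as below. The DOM selectors reflect HN's current markup, and the output field names are assumptions for illustration, not anything from Arc's or the commenter's actual script:

```javascript
// In a userscript, rows would come from the page, e.g.:
//   const rows = [...document.querySelectorAll("tr.athing")].map((tr) => ({
//     title: tr.querySelector(".titleline a")?.textContent,
//     href: tr.querySelector(".titleline a")?.href,
//   }));
// The transform itself is a pure function, so it also runs outside the browser.
function rowsToExport(rows) {
  return JSON.stringify(
    rows
      .filter((r) => r.title && r.href)          // drop rows with missing data
      .map((r, i) => ({ rank: i + 1, title: r.title.trim(), url: r.href })),
    null,
    2
  );
}
```

The resulting JSON string can then be copied to the clipboard or pasted into an LLM chat.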


This is awesome! I built a similar chrome extension https://rockyai.me/ that lets you summarize and chat with any webpage using your OpenAI key for free. I use that across HN articles and a bunch of other websites :)


So you're David Gosling?

nah that's an early user :)

Finally!

I tried to build something like this a few years back [0], I thought it was a great idea, but LLMs were not available yet, and I was busy with a hundred other things.

You can see an example of the summary there.

[0]: https://github.com/simonebrunozzi/MNMN


I'd love it if it could summarize the HN comments as well.

Yeah, I should do comments. This is a popular feedback item.

I added something similar into my feed reader. Even summarizes comments. The problem is paywalls, sites that heavily use Javascript, and other bot protection measures (such as OpenAI's blog which I found a little bit ironic). But I guess you might be able to get around bot protections and JS if it's a browser extension.

Brilliant, Love it. Excellent execution.

This is so weird. Why does every user have to use their own LLM subscription/keys instead of just storing the summaries of the HN posts on a static website somewhere (i.e. summarize once for all users)?

When I was incubating the idea, I thought about different concepts:

1. The current bring-your-own-key.

2. A central summary storage, filled by me.

3. A central summary storage, crowdsourced.

4. A paid subscription, where I effectively run some LLM proxy.

I wanted something low-overhead, just the right size for yet another weekend project that I could drop at any moment. Supporting some infrastructure, having moderation headaches, let alone receiving payments, ruled out pretty much everything but the current approach.


This is exactly what I built with https://hackyournews.com

That costs money: hosting a server, running the summarization yourself, etc.

While it is possible, such a service would require advertisements, a (probably monthly) fee, or a benevolent patron to pay for the costs.

With this, the only necessary benevolence is the creator of the extension.


the server cost is basically nothing since it's static content. probably fits into the free offerings from Cloudflare.

the summarization is where the cost is and it would be cool to have some crowdsourced contributor model there.

smells like a cool crypto/ai crossover project in the making (or maybe drop the crypto and have upvote-driven "moderation").


.. because being a threaded extension of some central hub connected to some other central hub is exactly the "vibe" ?

This sounds possibly useful, but I just don't trust extensions unless I can see the code.

Hmm, even though the code is open source, how do you know the published extension is bundled from the same code as what's open sourced? I guess I'm trying to say you'll have to be okay with some level of trust here, unless you clone the open source project and use it to load an unpublished extension yourself.

That's understandable; I feel the same when I install extensions. In both browsers, you can install the extension from local disk instead of from the browser stores. The release artifact is a ZIP file with plain JS inside (no bundling, minification, or preprocessing), so you can check it out. Both Chrome and Mozilla did some inspection over several business days, but I can't say exactly what they checked or how diligently.


I am an idiot for not looking at OP link. jeez. thanks.


