
Show HN: NotesOllama – I added local LLM support to Apple Notes (through Ollama)

156 points | rexec | 2 years ago | smallest.app

This lets you talk to local LLMs in Apple Notes. I saw Obsidian Ollama (https://github.com/hinterdupfinger/obsidian-ollama) and thought it was handy, but I'm too lazy to migrate away from the Apple ecosystem, so I quickly hacked this together. I tend to use Notes as a scratchpad for prompts, so it's nice to do some quick inference without leaving the app.

Notes doesn't really support plugins, so I'm using the macOS accessibility API to read selections and then streaming responses via the clipboard (not ideal, but it works).
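For anyone curious about the Ollama side of a hack like this: the server exposes a plain HTTP API on localhost, and the streaming endpoint returns newline-delimited JSON. A minimal Python sketch (the model name is just an example, and a local Ollama server on the default port is assumed):

```python
import json
import urllib.request

def collect_stream(lines):
    """Join the "response" fragments from Ollama's streaming NDJSON output."""
    chunks = []
    for line in lines:
        msg = json.loads(line)
        chunks.append(msg.get("response", ""))
        if msg.get("done"):
            break
    return "".join(chunks)

def generate(prompt, model="mistral", host="http://localhost:11434"):
    """One streamed completion from a local Ollama server (requires one running)."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": True}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return collect_stream(resp)
```

Keeping the NDJSON handling in `collect_stream` separates the transport from the part that actually moves text around (clipboard, accessibility API).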

31 comments


marcellus23|2 years ago

Have you considered using Services for this? Services already support taking selected text and doing transformations on it, and they allow user-configurable keyboard shortcuts. Plus they work in any app, not just Notes.

rexec|2 years ago

That feature is so well hidden I kind of forgot it exists. I'll check out the selection/transformation stuff!

jwells89|2 years ago

Echoing this. Services and other types of macOS system plugins (e.g. color palette plugins) are great with how they enable app-agnostic functionality but are unfortunately underused.

rcarmo|2 years ago

For those of you who want services, I went and hacked this together in 5 minutes:

https://gist.github.com/rcarmo/f96c659f149e357e1091cbfe352af...

You can drop the Python script into Automator and use that to publish a Service without tweaking anything else (but of course a proper solution would do some sort of API key management, maybe use the keychain module to retrieve that, etc.)

Still, it was a fun quick hack. Thanks rexec for the inspiration, I'm now looking into doing the same with a simple Lua/Obj-C app to publish the services directly.
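For a sense of the shape such a script can take (this is a sketch, not the linked gist: it targets a local Ollama server instead of a keyed API, and the model name and prompt wording are made up), Automator's "Run Shell Script" action passes the selection on stdin and can paste stdout back over it:

```python
#!/usr/bin/env python3
# Sketch of a "Run Shell Script" body for an Automator Quick Action:
# Automator pipes the selected text to stdin; printing the reply lets
# the Service replace the selection with the model's output.
import json
import sys
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(selection, model="mistral", instruction="Improve this text:"):
    """Wrap the selected text in a one-shot (non-streaming) Ollama request."""
    return {
        "model": model,
        "prompt": f"{instruction}\n\n{selection}",
        "stream": False,
    }

def run_service():
    selection = sys.stdin.read()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(selection)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"], end="")

# Automator would invoke this with the selection on stdin:
# run_service()
```

A proper version would, as noted above, pull any credentials from the keychain rather than hardcoding anything; with a local Ollama server there's no key to manage.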

jasonjmcghee|2 years ago

I'm a big fan of this space and have been hacking on it too.

With a few tricks you can use LLMs, or anything else you can call from a script, from anywhere in your OS via input capture/simulation and the clipboard. And it can be cross-platform!

Here's the project, in case it interests anyone.

https://github.com/jasonjmcghee/plock

rcarmo|2 years ago

This is a very clever stunt. I do want to echo other people's mentions of Services, which I use for many things and which span multiple apps.

smcleod|2 years ago

I highly recommend MindMac (https://mindmac.app), which adds OS-wide support for Ollama (and "Open"AI et al.) along with optional clipboard access and text entry. Unfortunately it isn't open source.

neom|2 years ago

I can't wait for a higher-quality built-in proofreader on macOS. I don't love real-time spell checkers; I'd prefer to just dump out an email with no error highlighting, hit a keystroke, and have it come back proofread (with dyslexia, often the only way I can find my own errors is by reading the text backwards). I built a GPT in the "build your own GPT" thinger ChatGPT has now; I copy/paste stuff in, hit enter, and it sends back a corrected version with a list of the corrections it made underneath. So far it's batting 1.000. I'm going to see if I can use this for that.

bugglebeetle|2 years ago

FWIW I just ran a bunch of tests with GPT-4 and it’s remarkably bad at spelling and grammar correction.

andy_xor_andrew|2 years ago

On this topic (using local LLM for analyzing local text on an iDevice) -

I highly suspect that the recent Journal app from Apple, which auto-installed via an iOS update, is intended to incentivize users to journal and write about their daily lives, so that when Apple inevitably ships a local LLM on iDevices, there is already a corpus of data for the model to RAG over and use to "understand" the user.

aaronbrethorst|2 years ago

This looks cool. I have a request: I've been using the Notes app for my todo lists since mid-2020. I have one note per day, and I then break them up by quarter and year, e.g.:

    2024
      Q1
        Monday, January 1, 2024:
          - [x] [XYZ] - Review pull request from [Person]
          - [x] Investigate error rates for [XYZ] in Sentry
I look back through these documents for two reasons:

1. Right now, I have to go back through all of my notes from the past week for engineering sync meetings to assemble a list of completed tasks, then I group them by functional area, and then I write out a little bullet-pointed synopsis that I share with my peers.

2. I use my completed todo lists from the previous year to help fill in my annual performance evaluation. I look back through the entire year for major projects I worked on.

I'd love to speed up both of these processes by pointing an LLM at all of these documents and having it auto-summarize either on a weekly, monthly, quarterly, or yearly basis.
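A model would likely do better at (1) if the completed items were grouped by functional area before summarization. A hypothetical pre-processing sketch for the layout described above (the regex and the "General" bucket for untagged items are assumptions about the format):

```python
import re
from collections import defaultdict

# Matches completed "- [x]" items, optionally prefixed by a "[Tag] -" area.
DONE = re.compile(r"-\s*\[x\]\s*(?:\[(?P<tag>[^\]]+)\]\s*-?\s*)?(?P<task>.+)", re.I)

def completed_by_area(note_text):
    """Group completed todo items by their leading bracketed tag."""
    groups = defaultdict(list)
    for line in note_text.splitlines():
        m = DONE.search(line)
        if m:
            groups[m.group("tag") or "General"].append(m.group("task").strip())
    return dict(groups)
```

Feeding the grouped output rather than the raw notes into a "Summarize" prompt gives the model the functional-area structure for free.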

rexec|2 years ago

Thanks! There's a "Summarize selection" prompt in there, so if you try it with a good model like Mixtral you might already get good results for (1). For (2) I'm guessing you'd want to be able to write a custom prompt?

codazoda|2 years ago

I love to read about this type of stuff. Have you blogged about or written about it in more detail here on HN?

Edit: I reworded my question because it was a bit vague.

al_borland|2 years ago

Another option for hacking something like this together could be Hammerspoon. I’ve spent some time with it, but haven’t tried integrating with Apple Notes; I mostly did stuff at the file system level to keep it easy.

https://www.hammerspoon.org/

great_psy|2 years ago

What is the intended use for this?

The Notes app seems like an odd place to ask Google-like questions.

Is it supposed to help me with writing long-form text? Am I supposed to use it as a spell/grammar checker?

This is not directly a question for your integration, but more of a general question for using local LLMs for long form text.

ukuina|2 years ago

Some people have years (or decades!) of text notes that would benefit greatly from summarization and LLM query.

Asking generic questions is probably a poor demo choice, but it shows the link to the LLM in context.

taude|2 years ago

I just watched this Tiago Forte video [1] on the new Google tool called NotebookLM. In the video he basically answers the "what" of your question. There's a lot one can do with a boatload of notes you've kept locally, and with your own sources that haven't been scraped by an LLM.

[1] https://www.youtube.com/watch?v=iWPjBwXy_Io

ehack|2 years ago

I compiled it from source; works nicely. You need to change the settings to "run locally" by default, rather than building an app.

cadr|2 years ago

Seems hugged. Looking forward to checking it out when the site is back up; I want this exact sort of thing.

rexec|2 years ago

Should be up now (it's a static site behind Cloudflare)

syntaxing|2 years ago

Love that workaround, extremely clever

Jommi|2 years ago

This is dope!