The unique feature of Zasper is that the Jupyter kernel handling is built with goroutines, and it is far superior to how it's done by JupyterLab in Python.
Zasper uses about a quarter of the RAM and a quarter of the CPU that JupyterLab uses. While JupyterLab uses around 104.8 MB of RAM and 0.8 CPUs, Zasper uses 26.7 MB of RAM and 0.2 CPUs.
Other features, like search, are still slow because they haven't been refined yet.
I am building it alone, full-time, and this is just the first draft. Improvements will certainly come in the near future.
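For context on where figures like the 104.8 MB and 26.7 MB above presumably come from: they look like resident-set-size (RSS) readings for the server process. A minimal stdlib sketch of how one could take such a reading (Linux-only, since it reads procfs; the helper name is mine):

```python
import os

def rss_mb(pid: int) -> float:
    """Return a process's resident set size in MB by reading
    /proc/<pid>/status (Linux only). VmRSS is reported in kB."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                # Line looks like: "VmRSS:     26700 kB"
                return int(line.split()[1]) / 1024
    raise RuntimeError(f"VmRSS not found for pid {pid}")

# Example: measure this process itself.
print(f"current process: {rss_mb(os.getpid()):.1f} MB")
```

To compare servers you would point this at the `jupyter-lab` and `zasper` PIDs while both sit idle; cross-platform tools like `psutil` do the same thing portably.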
IPython maintainer and Jupyter dev here (even if I barely touch frontend stuff these days). Happy to see diversity; keep up the good work, and happy new year. Feel free to open issues upstream if you find gaps in the documentation or problems with the protocol. You can also try to reach out to the Jupyter media strategy team; maybe they'll be open to having a blog post about this on blog.jupyter.org.
The actual RAM issue is another one. Every Python kernel you start consumes around 100-150 MB of RAM. So unless you are starting different kernels with Zasper, the majority of the RAM usage is still going to be the same.
Can I sway you to take this into a ... certain direction?
From my POV, any browser-based editor will be inferior to Emacs (and to a lesser extent Vim), simply because it won't run my elisp code. While a fresh and snappier UI compared to e.g. Jupyter would be nice, I would love to see something that integrates well with Emacs out of the box.
So perhaps it would be really nice if the backend+API were polished as an end product in itself, in such a way that it could easily interface with other frontends, with remote attachment.
I could go on with my list of demands but I would be thrilled and amazed at my luck if even those two happen...
Congratulations on the launch! It's great to see alternatives to Jupyter. JupyterLab is excellent; however, creating an editor for a broad audience is challenging. I've found Jupyter difficult to use, especially for beginners. Managing kernels, Python environments, and installing new packages can be quite cumbersome. Are you planning to address these challenges in Zasper?
I'm not directly involved with extending JupyterLab, but I'm involved with the results (and testing) of our extension on a daily basis. What I very often find to be the source of complaints is the error reporting. In particular, the kind of error reporting that just disappears from the screen after a few seconds. If there's one singular feature of JupyterLab that I really want changed, it's this.
Just wanna say this is a really cool project, and I can't think of higher praise than hoping I build something as cool as this some day! I've been meaning to learn Go for some time now, and will be referring to Zasper in the future :)
It's probably an unrelated post (apologies in advance), but I wanted to give a shoutout to Marimo (https://marimo.io). It's the only Jupyter alternative that really got me excited; it's like Streamlit and Jupyter had a kid (and the kid took the best genes from both).
>> marimo notebooks are pure Python and stored as .py files
That sounds like a solid improvement. I’m going to give this a test drive. I feel like modularity is one of the hardest aspects of Jupyter notebooks in a team environment.
I’d be interested to hear if anyone has cracked a workflow with notebooks for larger teams. Notebooks are easy for solo or very small teams, and the literate-programming-style benefits still apply in larger teams, but there’s a lot of friction: “hey, just %run this shared notebook with a bunch of useful utilities in it - oops, yeah, it tries to write some files because of some stuff in there that’s unrelated to your use case (but essential to mine)”.
My current best approach that I know of is to keep “calculation” (pure) code in a .py file and just the “action” (side-effectful) code in the notebook. Then, as far as physically possible, keep the data outside the notebook (usually a database or CSVs). That helps avoid the main time-sink pitfalls (resolving git conflicts, versioning, testing, etc.) but it doesn’t cover, for example, tooling you might want to run - maybe mypy against that action code - sure, you can use nbqa, but… interested to learn better approaches.
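The calculation/action split described above can be sketched in a few lines (the module, function, and file names here are invented for illustration):

```python
# calculations.py -- the pure "calculation" layer: no I/O, no globals,
# so plain pytest/mypy can run against it like any other module.
def summarize(values: list[float]) -> dict[str, float]:
    n = len(values)
    return {"count": float(n), "mean": sum(values) / n if n else 0.0}


# --- notebook cell: the thin "action" (side-effectful) layer.
# In the notebook you would `from calculations import summarize`,
# read rows from a database or CSV, then call only the pure function.
values = [1.0, 2.0, 3.0]      # stand-in for data loaded from a CSV
print(summarize(values))      # {'count': 3.0, 'mean': 2.0}
```

Keeping all side effects in the notebook cell means git diffs, unit tests, and type checkers only ever need to touch the .py file.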
marimo is really cool, although "pure Python" is only true insofar as the diff is concerned. Other than that, it's an unconnected group of functions that need the marimo runtime to stitch them together.
It would be cool if marimo could "unroll" the compute graph into a standalone Python script that doesn't need the marimo library.
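For what it's worth, "unrolling" is essentially a topological sort of the cell dependency graph. A rough stdlib sketch of the idea (the cells and variable names are invented, and this is not marimo's API - marimo tracks these reads/writes automatically):

```python
from graphlib import TopologicalSorter

# Hypothetical cells, each recording which variables it defines and reads.
cells = {
    "plot":  {"defines": set(),         "reads": {"clean_df"}},
    "load":  {"defines": {"raw_df"},    "reads": set()},
    "clean": {"defines": {"clean_df"},  "reads": {"raw_df"}},
}

# A cell depends on whichever cell defines each variable it reads.
def_site = {v: name for name, c in cells.items() for v in c["defines"]}
deps = {name: {def_site[v] for v in c["reads"]} for name, c in cells.items()}

# Emitting the cell bodies in this order yields a plain top-to-bottom script.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['load', 'clean', 'plot']
```

Once the cells are in dependency order, concatenating their bodies would give a script that runs without the runtime.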
What’s the advantage of this? It isn’t obvious to me that reducing the memory and CPU usage of an empty/idle kernel is all that meaningful if the actual Python code in your notebook uses far more resources. It’s also not obvious to me how Go’s better threading helps if all the computational bits are in Python anyway.
I have one nit with JupyterLab. When I press Ctrl+F, it takes ~0.4 seconds for the search box to open, and sometimes the first keystroke doesn't register when I type something into that search box.
"Zasper ... provides ... exceptional speed".
If they can just make input latency indistinguishable from vim, that's a very worthwhile value add.
It is quite beneficial for people who aren't writing Python, and for them, managing JupyterLab installations is a bit of a pain.
I would like to use this with the xeus kernel for SQL (which is also native), and if this reduces the resource consumption of that setup significantly, it's a big plus for me.
Yes, the problem with such projects is that there must be very clear benefits for users (rather than developers) to attract a critical mass. At work we had Apache Zeppelin running on the servers alongside Jupyter. In practice almost nobody used it (probably because almost nobody else used it, so if you ran into any issues you were on your own), so it was quietly shelved after a few years.
- The UI is bloated and buggy: sometimes things scroll, sometimes they don't, and sometimes you have to refresh the page. You cannot easily change the UI, as lots of CSS parts have hard-coded fixed sizes.
- The settings are all over the place, from .py files in ~/.jupyter to ini files to auto-generated command line parameters.
- The overall architecture is monolithic and hard to break down; jupyter proxy is a good example of the hacks you have to resort to in order to reuse parts of Jupyter.
- The frontend technology (Lumino) is ad hoc and cannot be reused; I had to write my own React components, basically reimplementing the whole protocol. Come on, it's 2025.
- The whole automation around nbconvert is error-prone and fragile.
While it looks like a great effort was put into this, an alternative has to support the same platforms, languages, and related tooling - rather than running only on macOS with partial support on Linux, and supporting only IPython.
Then all the performance improvements from using Go are taken away by using Electron.
For a fully fledged web app that all the major code notebooks tend to be, Electron makes a lot of sense. The bundled webviews built into OSes tend to be weak and outdated compared to the Chromium build that comes with Electron.
It's why Jupyter fits pretty well into VSCode/VSCodium.
> 5. Rendering your app
> Electron uses Chromium under the hood, so your user sees the same thing on Windows, Linux, and macOS. Tauri, on the other hand, uses the system webview: Edge WebView2 (Chromium) on Windows, WebKitGTK on Linux, and WebKit on macOS. Now here comes the bad part: if you are a web developer, you know that Safari (based on WebKit) is always a step behind every other web browser. Just check out Can I Use. There is always a bug that you are not seeing in Chrome, only your dear Safari users are. The same issues exist in Tauri, and you can't do anything about it; you have to include polyfills. The winner has to be Electron here.
This looks pretty nice. It specifically replaces the JupyterLab frontend while keeping the connections to Jupyter kernels, so there shouldn't be any theoretical reason it couldn't support JavaScript or other language kernels, although I guess the project has only been tested with IPython kernels.
I'll check later whether this is supported, but I would love an RStudio-like interface in Jupyter. Being able to hit Ctrl+Enter to run a block of code (not a line) in the accompanying REPL is huge for iterating and building new things.
As an example, I love JupyterLab's "open console for notebook", but I can't find a way to send copied text to it, or to switch focus to it with a keyboard shortcut.
It's a big reason I can't use VS Code's Jupyter implementation.
What's the point of this? The only benefit seems to be decoupling the frontend in React. Nobody complains about Jupyter's performance. You could just build the frontend and keep Jupyter as it is; it is already concurrent enough for multi-user use cases.
> ... A Modern and Efficient Alternative to JupyterLab ...
This is not meant as criticism, just perspective. It's a classic development sequence:
* A team creates a powerful, small-footprint, REPL environment.
* Over time people ask for more features and languages.
* The developers agree to all such requests.
* The environment inevitably becomes more difficult to install and maintain.
* A new development team offers a smaller, more efficient REPL environment.
* Over time ... wash, rinse, repeat.
This BTW is what happened to Sage, which grew over time and was eventually replaced by IPython, then Jupyter, then JupyterLab. Sage is now an installable JupyterLab kernel, as is Go, among many other languages, in an environment that's increasingly difficult to install and maintain.
Hey -- just saying. Zasper might be clearly better and replace everything, in a process that mimics biological evolution. Can't leave without an XKCD reference: https://xkcd.com/927/
There is no such thing. There are Jupyter kernels. JupyterLab is just one of many UIs that speak the Jupyter protocol. Other examples include the original Jupyter notebook editor, VSCode Jupyter extension, and now Zasper.
I'm pretty sure Sage was always intended as a project that integrates the world, never "small footprint".
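The protocol point above can be made concrete: frontends like JupyterLab, VS Code, and Zasper all exchange JSON messages with kernels over ZeroMQ sockets, each message signed with HMAC-SHA256 using a key from the kernel's connection file. A rough stdlib-only sketch of building and signing an `execute_request` (the field set follows the messaging spec; the key and code string are made-up examples):

```python
import hashlib
import hmac
import json
import uuid
from datetime import datetime, timezone

def sign(key: bytes, *parts: bytes) -> str:
    """Jupyter wire-protocol signature: HMAC-SHA256 over the
    serialized header, parent_header, metadata, and content, in order."""
    h = hmac.new(key, digestmod=hashlib.sha256)
    for p in parts:
        h.update(p)
    return h.hexdigest()

# An execute_request, as any Jupyter frontend would send it to a
# kernel's shell socket.
header = {
    "msg_id": str(uuid.uuid4()),
    "session": str(uuid.uuid4()),
    "username": "demo",
    "date": datetime.now(timezone.utc).isoformat(),
    "msg_type": "execute_request",
    "version": "5.3",
}
parent_header, metadata = {}, {}
content = {"code": "1 + 1", "silent": False}

frames = [json.dumps(d).encode()
          for d in (header, parent_header, metadata, content)]
key = b"example-connection-file-key"  # normally read from the connection file
signature = sign(key, *frames)
print(signature)
```

Anything that can produce these frames - Python, Go, or anything else - can act as a Jupyter frontend, which is what makes projects like Zasper possible.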
SageMath serves a different purpose, which is scientific computing, in order to compete with Mathematica and MATLAB. It offered a good interactive notebook interface up until about 2016, and later migrated to the Jupyter backend. It currently isn't well supported on Windows, which may be what you meant by the complexity. However, it works pretty well on Linux systems.
I hope you liked the first draft.
I'd be keen to offer it as an alternative to Jupyter on my little GPU platform experiment.
This is why I moved to working with Jupyter notebooks in VS Code; there is no server to start manually.
https://www.levminer.com/blog/tauri-vs-electron
https://livebook.dev/
Would be interested to see where this goes.
You are not showcasing anything, just looping low-resolution screenshots with special effects.
Again, not meant as criticism -- not at all.