borland | 2 years ago
That's a lot of very high hurdles to clear. Even if this magically solved all my scaling and distributed system problems forever, I'm not sure it'd be worth it. Good luck to them though for being ambitious.
pchiusano|2 years ago
This post https://www.unison.cloud/our-approach/ talks more about why such radical changes were necessary to achieve what we wanted. (In particular check out the "3 requirements of the dream" section, which walks through what the programming language needs to support to be able to do things like "deploy with a function call.")
My general take on "when and where to innovate" is: if you can get a 10x or more improvement in some important dimension by doing things differently, it can absolutely be worth it. This is the philosophy we've applied in developing Unison over the years. I am generally happy to learn something new if I know that I'll be getting something substantial out of it. Of course it can be hard to tell from the outside if the benefits really are worth the changes. I'm not sure what to say about that, other than try it out with something low risk and decide for yourself.
Besides the distributed programming / cloud stuff, I'll give a couple other examples where we gain advantages by doing things differently: by storing Unison code in a database, keyed by the hash of that code, we gain a perfect incremental compilation cache which is shared among all developers of a project. This is an absolutely WILD feature, but it's fantastic and hard to go back once you've experienced it. I am basically never waiting around for my code to compile - once code has been parsed and typechecked once, by anyone, it's not touched again until it's changed. This has saved me countless hours compared to other static languages. And I don't have to give up static typing to get this.
This sort of content-addressed caching also plays out for testing - for pure tests (which are deterministic), Unison has a test result cache keyed by the hash of the test code. This also saves countless hours - imagine never needing to rerun the same tests over and over when nothing's changed! (And having certainty that the cache invalidation is perfect so you don't need to do a "clean build just to be sure")
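This caching shows up directly in Unison's `test>` watch expressions. A minimal sketch (the function and test names are invented for illustration; `check` is from the base library):

```unison
square : Nat -> Nat
square n = n * n

-- A pure test like this runs once; until the hash of `square`
-- (or of the test itself) changes, ucm replays the cached result
-- instead of re-running it.
test> square.tests.ex1 = check (square 4 == 16)
```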
Also replied here re: self-hosting https://news.ycombinator.com/item?id=39293568
bloppe|2 years ago
1. "Deployment should be like calling a function" isn't that the mantra of serverless? e.g. GCP Cloud Run or AWS Lambda? This is also becoming much more streamlined with server-side WASM e.g. wasmCloud.
2. "Calling services should be easy" this is what protobuf is for; cross-language client libraries that handle transport, de-/serialization, native typing, etc.
3. "typed storage" isn't this basically an ORM? I suppose it's more general since it doesn't have to be relational, but ORM ideas could just as easily be adapted to JSON blob stores using something like protobuf.
Also, storing Unison code in a database, keyed by the hash of that code, sounds a lot like using Bazel with a shared remote cache.
I'm not saying Unison isn't cool, but to win me over I'd need you to compare Unison to all these existing technologies and really spell out what differentiates Unison and what makes it better.
mike_d|2 years ago
Programmers will happily hire a lawyer or a receptionist, but will code themselves into a fury and invent programming languages to avoid admitting they suck at ops and should hire someone.
Let's just call it what it is: the cloud is ego-driven outsourcing. Nobody wants to admit they need an ops person, so they just pay for 1 millionth of an ops person every time someone visits their website.
dmix|2 years ago
> by storing Unison code in a database, keyed by the hash of that code, we gain a perfect incremental compilation cache which is shared among all developers of a project. This is an absolutely WILD feature, but it's fantastic and hard to go back once you've experienced it. I am basically never waiting around for my code to compile - once code has been parsed and typechecked once, by anyone, it's not touched again until it's changed.
Interesting. What's it like upgrading and managing dependencies in that code? I'd assume it gets more complex when it's not just the Unison system but 3rd party plugins (stuff interacting with the OS or other libs).
adastra22|2 years ago
So… ccache?
jweir|2 years ago
This could lead to some huge advantages, and some new obstacles.
I played with the language for about a week and found it intriguing. And it seems to approach tackling Joe Armstrong's question "Why do we need modules at all?" -> https://erlang.org/pipermail/erlang-questions/2011-May/05876...
parentheses|2 years ago
The odd thing is unison started purely as a language. Now there's a platform.
I'd love to hear some opinions from outside Unison about how they like using this language, tooling and hosting.
seagreen|2 years ago
I often find the best way to understand complex things is to dig all the way back to when they were being thought up. In this case there's a blog post from 2017 that I still find useful when thinking about Unison:
https://pchiusano.github.io/2017-01-20/why-not-haskell.html
Key quote:
Composability is destroyed at program boundaries, therefore extend these boundaries outward, until all the computational resources of civilization are joined in a single planetary-scale computer
(With the open sourcing of the language I doubt it will be one computer anymore, but it's an interesting window into the original idea)
Personally I find there's a lot to this. It's interesting that we're really, really good at composing code within a program. I can map, filter, loop and do whatever I want to nested data structures with complete type safety to my heart's content. My editor's autocompleting, docs are showing up on hover, it's easy to test, all's well.
But as soon as I want cron involved, and maybe a little state, this is all wrecked. And deployment gets more annoying too, which they talk about a lot.
So I think Unison always had to have a platform to support bringing this stuff into the language, even though they built the language first.
> I'd love to hear some opinions from outside Unison about how they like using this language, tooling and hosting.
I'd like to hear this too.
Also, it would be great if there was something like https://eugenkiss.github.io/7guis/ or https://todomvc.com/ for platforms that we could use to compare Unison, AWS, etc etc. Or is there already a 7GUIs for platforms that I don't know about?
jcwilk|2 years ago
I actually found out about Unison because, from my own side projects, I came to the conclusion that strongly typed, hash-addressed functions were a super compelling approach to highly modular and maintainable programming - especially for LLM-generated code, since refactorings and new function generation then require very little context, something desirable for humans but essential for LLMs. After digging around for something that did this I found Unison, and have now mostly abandoned my own tooling because Unison is so much more mature and has such competent people behind the wheel.
There is a learning curve for sure, not just with the tooling but also the language. It's a challenging language steeped in advanced software engineering principles, but I would 100% rather spend my time honing my fundamental understanding of my craft rather than learning another 20 AWS tools which are going to go out of style in 12 months. After becoming mildly proficient in Unison I feel like I have such a broader understanding of programming in general even though I've been a full time backend coder for 15+ years.
As for the tooling, it does what it needs to and does it well, with very competent folks discussing and debating the minutiae daily. It's a small team, which keeps them nimble, with major improvements landing each month.
Today I'd say that it excels at microservices, the things you might otherwise reach for a traditional serverless function for, but it gives you far more agility and brevity to tweak the application in a surgical, controlled way that's aligned with the behavior rather than with text files. Something just feels very right about storing the AST as-is and manipulating it more directly.
Tomorrow, as more supporting libraries get built and more interfaces to the world outside Unison get developed, anything's possible really. I'm personally certain that we'll see some continued shift towards making ASTs the source of truth, so I see learning about it and following the software as an investment in myself and my future capabilities, regardless of whether the future ends up being Unison or something like it. Unison is going out of their way to do all the right things, even when that's not the most practical choice given the current corroded state of web engineering in general, so I'm eager to get in on that as much as possible.
joyfulcoder|2 years ago
I'm not affiliated with the company at all.
I built the start of a very basic site with Unison and HTMX.
https://cross-stitch-alphabet.netlify.app
In my day job I'm a Rails developer. I've been consistently frustrated at how few languages are truly composable and been getting increasingly disillusioned with mainstream languages.
So that's my context.
The not so great:
The language and principles are hard to learn. I've had to throw away what I already know about a lot of programming.
Coding inside ucm requires a very different mentality to how we build software.
The tooling is still early days and has many rough edges.
Performance is currently poor but will get much better shortly.
Abilities are incredible but demand the user to be very familiar with recursion.
Like many on here, I have lots of questions. It's not clear how migrations will work. I don't understand BTrees. If Unison Corp goes under, what happens to my code?
Now for the good.
Unison is, hands down, the most radically joyful language I've ever used.
It's caused me to realise that most tools we use in software are fairly primitive compared with what they could be.
The fact is that even the benefits in the marketing of Unison are barely scratching the surface of what's possible in this language.
For example, in a few hours I made the basics of an end-to-end testing library that emulates HTMX with local function calls.
This, if fleshed out, would be the holy grail for me: fast, cacheable end-to-end tests that don't require spinning up a browser.
The possibilities are mind boggling.
I was utterly delighted with the deploy in a single function feature, something I'm now never going to be able to go back from having.
And deploying a database with schema in two lines is just jaw dropping.
Every time I use Rails now it's clear how much better our coding experience could be.
By building in Unison you get ports and adapters for free. Never have to wait for a test suite again. No infrastructure as code. No JSON. No yaml. Bliss.
In summary, it's radical. Would I run a production system on it yet? Nope.
Would I watch it keenly until it amasses a bit more momentum? You bet.
I believe that whether Unison succeeds or fails, this is the future of programming.
Oh and they're a delightful group of people to be around. The discord community has been beyond supportive to me whilst learning the language.
KPGv2|2 years ago
As a somewhat stay-at-home dad I was looking to do something fun with programming, and ideally something where I could make an early impact. About the same time, I read about Tree Sitter Grammars and was looking at Lapce, a Rust IDE in its early stages that uses TSGs for syntax highlighting.
So I ended up learning everything about the Unison syntax, going so far as to learn me a Haskell for great good to produce the TSG for Unison.
About that time, Unison started opening up early testers of Cloud. I passed because I hadn't really written much code in Unison. I'd just been writing the TSG (in C++ and JS).
But then I picked a project: write an implementation of Philips Hue's bridge API in Unison. In the process, I learned about server-sent events and wrote a library for that and released it on Unison Share, which I think of as a mix of Github and NPM (or Maven, or pypi, etc.). I also wrote a MimeType typings library.
I can't speak to the cloud stuff yet, and when I do use it, I won't have much to compare it to because my ops exposure in my profession (as a stay at home dad, haha) is limited.
That's my background, and here are my thoughts:
First, the Local UI for browsing your code and documentation is hands down the best I've ever seen. Everything you write is browsable there, and has hyperlinks to everything else. It's so money. And there's a `Doc` type as well, so you can write something like
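(The code snippet appears to have been stripped here; going by the surrounding description, it was presumably something along these lines, with the type's shape invented for illustration:)

```unison
-- `Foo.doc` attaches documentation to the `Foo` type below
Foo.doc = {{ A wrapper holding a Nat, an Int, and two payloads. }}
type Foo a b = Foo Nat Int a b
```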
Then you can `add` and `Foo` will be stored in the current namespace, but so will `Foo.doc`, which is the content inside `{{ ... }}`. You can then delete this code from your scratch file and never think about it again. If you browse the Local UI (you type `ui` in your `ucm` instance and it auto-loads in a browser), you can easily view the type, the doc above it, and you can click `Nat` or `Int` to be taken to their definitions in the base library, located at `lib.base` (`lib` holds your dependencies, like `node_modules` in JS).
Say you later want to add a third type parameter. `edit Foo` and the current definition will be pretty printed to your scratch file. Then you can edit it, and run `update`. Anything relying on this type that can be migrated to the new definition will, and anything that can't automatically be migrated will get dumped to your scratch file for you to update manually. Once you have no more errors in your scratch file, `update` will finish it.
This feels a lot like the process of `git rebase --continue` until everything is consistent. Except here it's the code itself that `ucm` understands, not text data that `git` doesn't understand beyond "this is text with conflicts."
From one `ucm` instance, I can switch between projects. No managing folders on my computer in `/Users/foo/workspace/foo-project`, etc.
Anyway, the long and short is that once I got used to working this way, I immediately wished this existed for TypeScript as well, because that's what I do so much of my work in. The doc generation is incredible, the source browsing is so good, and the process of updating my code is really slick. A few versions ago it was less so, but it's been improved since then and now I really like it.
Pushing code is as easy as `push`. You can create releases of your libraries or applications by going to Unison Share, finding your project, and navigating through the simple "cut a release" wizard.
There are even types for License, CopyrightHolder, etc. so metadata about your application can be done in code. For example, the license for my mimeType library is
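(The snippet appears to have been stripped here; judging by the surrounding description, it was presumably something like the following, with the holder name and year invented:)

```unison
-- License metadata expressed in code rather than a manifest file
mimeType.license = License [CopyrightHolder "KPGv2"] [Year 2023] mit
```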
The type for `License` is defined as `License [CopyrightHolder] [Year] LicenseType`. There are pre-configured license types in the base library, and `mit` is one of them. I find this to be a nice addition as well, although for many this is something to be ignored. But I like the idea of encapsulating so much of a project in code rather than in things like a `package.json` file that is brittle.
The one other thing I'd like to mention is abilities. I'm really familiar with monadic programming (my coworkers might say I'm too in love with it :), but wrapping my head around abilities was still hard at first. They're kind of like... monads, DI, and interfaces all mixed together. But there are essentially two components: the ability and the ability handler. The ability is like defining an interface: any handler needs to know how to handle each of the "requirements" of the ability. For example, you might write an application that communicates with an API for example.com as
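(The ability example appears to have been stripped here; a hypothetical sketch along those lines, with the operation names and the `UserId`/`User` types invented:)

```unison
-- Each signature below is an "ability requirement" that any
-- handler of ExampleApi must know how to interpret.
ability ExampleApi where
  getUser : UserId -> User
  postComment : UserId -> Text -> ()
```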
You're somewhat defining an interface that handlers need to conform to (i.e., they must have code that handles each of the "ability requirements"). Your handler then essentially converts your abilities to more fundamental ones, or removes them completely. In an application, you're generally working your way down to only the IO and Exception abilities (there are probably some cloud abilities I'm not familiar with), which UCM handles natively. Your handler is like the implementation of an interface.
From there, you can write code using anything in that ability, and so long as some ancestor function call wraps all that in a handler, everything just works. It kind of acts like injecting your handler as a dependency of everything that is a descendant function of the handler.
I don't know if I'm effectively communicating how this works, but it makes sense for me. Those are the analogues I'm familiar with that I used to understand the ability system.
Now that I feel comfortable with it, it's pretty cool!
Edit: My final thought is that the language is really nice to use (there are some things from TypeScript I miss, but they're very few, and it's certainly superior to something like Java IME). It's nice that the VCS and documentation/code browser are built in. Being able to push to a repo, again built into the ucm program, is convenient. Everything is wrapped up nicely. And the company behind the language is extremely online and responsive. I've gotten so much help from them. I wish I could speak to the cloud offerings, but I haven't worked with them yet.
__MatrixMan__|2 years ago
Tinkering on the weekends just isn't the same.