
炊紙(kashikishi) is a text editor that utilizes GPU to edit text in a 3D space

240 points | hiroshi3110 | 1 year ago | github.com

101 comments


downvotetruth|1 year ago

So this does not get flagged: https://github-com.translate.goog/mitoma/kashiki2?_x_tr_sl=a...

Also, hotkeys for a set of predefined isometric camera views would seem useful; maybe I am not seeing it? https://github.com/mitoma/kashiki2/blob/main/kashikishi/asse...

coder543|1 year ago

Translation is difficult at the best of times. I thought it was interesting how Google Translate seemingly kept coming up with different translations for the name of the program. Under “Features”, it suddenly decides the name is “Takigami”, as one example. By the end, it even goes so far as to say: “When you start up the cooking program, the following screen will be displayed.”

I asked ChatGPT 4o to translate the README: https://chatgpt.com/share/6700bed9-1198-8004-8eed-07f5055d07...

The translation seemed largely consistent with what Google Translate provided, but some of ChatGPT’s translation differences seemed more plausible to me, and it certainly reads more coherently. It also doesn’t keep forgetting that it’s dealing with the proper name of the program.

I didn’t try Gemini for this, but I imagine it has to be decent at translation too, so I wonder if/when Google will use Gemini to assist, replace, or otherwise complement Google Translate.

Rendello|1 year ago

Japanese is my favourite written language. I love English, but I'm definitely jealous of the beautiful glyphs and the vertical writing. From what I've seen, though, vertical writing is often poorly supported in software, which is a shame.

joshdavham|1 year ago

Haha, can you read Japanese though? It's beautiful for sure, and it even feels a little different when reading it, as if you're, in a way, sorta sounding out pictures. But man is it a pain in the butt to learn!

kibwen|1 year ago

English had beautiful writing, but it was destroyed by technology. First by the printing press, then by typewriters, then by low-resolution computer monitors. All of the human character and calligraphic qualities of the script have been mechanically stripped away in order to better accommodate what are now outdated legacy technologies, but everyone is so used to the status quo that we don't even realize what we've lost, and instead just accept that English script happens to be uglier than Japanese or Arabic or Devanagari. In an alternate universe, we could be reading this in a script reminiscent of, say, the Uncial script used in the Book of Kells (which is what inspired Tolkien's beautiful Tengwar script).

timeon|1 year ago

> I love English

Why English in particular? English uses Latin script pretty randomly.

-- sincerely yours, Ptoughneigh

shannifin|1 year ago

Would love something like this in VS Code so I could smoothly zoom in and out of my code rather than scrolling and clicking tabs.

schainks|1 year ago

Say more! How does the existing zoom function not do enough?

amjoshuamichael|1 year ago

THIS! So much this!!

With the jump from 2D screen to AR-based UI, we have the chance to re-think all of the conventions that have gripped UI/UX design over the past few decades. How many apps would benefit from being able to visualize data in a 3D space? How many new ways could we interact with computers, if we could reach out and touch things? Text editing, video editing, image editing (visualizing Photoshop layers?), 3D modeling, sketching, gaming: all revolutionized by a new input paradigm.

That's partially what I thought Apple would accomplish. They have a history of totally rethinking every part of software when a new input device comes around. I mean, think about the jump from the iMac to the iPhone. ["I just take my finger, and I scroll."](https://www.youtube.com/watch?v=FSv5x3V_KHY) I shudder to think how many drugs Apple employees had to take in order to think around traditional desktop conventions and come up with this stuff.

I figured with the Vision Pro, we'd see traditional apps reformed to a new, never-before-seen standard, but I have unfortunately seen very little of that. If you scrape off all of the high-budget polish, Vision Pro feels like a device that another company would create that Apple would then do correctly. By extension, the Meta Quest lineup feels the same way.

But this is the kind of thing I absolutely want to see more of. There's a physicality to this text editor that feels intuitive, but more importantly, it feels comforting. When things appear and disappear on screens instantaneously without any animation, it signals to our brains that something is wrong, because that's unusual behavior. There's a purpose to animation; it's not always just for show.

Bringing physicality like this to a 3D interface in mixed reality is, in my opinion, the next step in UI design. This text editor isn't getting super crazy with its effects, but you can already see the potential. As these devices come down in price and more developers get their hands on them, I hope to see more like this.

Hell, seeing this is the closest I've ever gotten to splurging on a Meta Quest so I could whip up a 3D modal text editor. I want a digital kitchen timer I can physically wind and unwind for Pomodoro timing. I want to pick an album to listen to on Apple Music from a stack of records projected onto my floor. Impractical? Perhaps. But look at early skeuomorphic iPhone apps and tell me those are practical. If all we cared about was using computers to get from point A to point B, we'd all work in TUIs, and r/unixporn wouldn't exist.

I don't know what it is, but I sense a fundamental lack of interest in this new input paradigm, both from companies like Apple & Meta and from developers. Hopefully open source projects like this will show people the real potential of this new hardware.

amelius|1 year ago

If you're going that far, why not build something for a VR/AR setup?

tikimcfee|1 year ago

If you've got an iPhone or iPad, I've got an AR prototype for glyph based rendering like this you might enjoy playing with: https://github.com/tikimcfee/LookAtThat

The alpha release builds an AR app that pulls code and renders it in space.

zelphirkalt|1 year ago

"to cook paper"? That's at least what it seems to translate to.

tempodox|1 year ago

I'm not familiar with Japanese script systems, but this looks fantastic.

panza|1 year ago

"Sakishi is based on Emacs key bindings. There is a very deep and logical reason for this, but for the purposes of this document, I will just state it as my preference."

I genuinely love Emacs people.

ahartmetz|1 year ago

I once set out to learn Emacs, but the stupidly multi-key bindings for very basic things (Ctrl-X Ctrl-S to save, Ctrl-X Ctrl-C to quit, etc.) turned me off. Deep and logical my ass.

joshdavham|1 year ago

Checking out this repo just made me realize that a good way to avoid getting spam pull requests in your repo is to maintain it in a language other than English.

esperent|1 year ago

I've worked for maybe ten years in the open source space, as a maintainer and contributor. I've seen thousands of pull requests in that time. Some great. Most ok. Some terrible. All made by people who want to contribute in some way for free.

I've seen close to zero spam pull requests. Are these common?

Kwpolska|1 year ago

It’s also a good way to prevent getting useful pull requests.

retrac|1 year ago

Japan is somewhat a world of its own when it comes to open source projects. (And also commercial software, I suppose.) Mostly due to the language barrier. There's some wonderful stuff out there that's unfortunately only documented in Japanese. Even the Ruby language took about five years after becoming popular in Japan to cross the language barrier and catch on elsewhere.

dheera|1 year ago

At least it's better than getting spammed issues.

growt|1 year ago

"Text editing is still plagued by poor user interfaces: when you press the "A" key, the letter "A" appears on the screen without any interaction, when you press "Delete" it disappears in an instant, the cursor disappears to the right edge of the screen, then suddenly appears on the left edge" Maybe I'm old fashioned, but this is exactly what appeals to me regarding text based interfaces.

pjc50|1 year ago

From the translated page:

"Although Japanese is primarily written vertically, there are not many text editors that fully support vertical writing. However, you can gain deeper insight by reading a text vertically or horizontally, or by flexibly changing the layout and rereading it. With Sakishi, you can instantly switch between vertical and horizontal writing while editing a document."

That was my "aha!" moment when reading this. Japanese has been made to fit Western convention a lot of the time, but it's good to have another option.

soraminazuki|1 year ago

The author is probably half-joking in that part of the README. There's a slight hyperbolic tone in the original text that's lost in the translation.

Vampiero|1 year ago

Always remember who your target audience is. If it's people who actually use a computer for writing, instead of just looking at cat pictures on the internet, then they probably don't care about fancy transitions and animations. They want to get shit done and they need their workflow to be optimized.

If it's just for normal people, then go wild with all the useless, CPU-wasting frills. Feel proud about it, even. In 2024 a snappy user interface only requires a few GBs of RAM, several intercommunicating processes and the ENTIRE FREAKING WEB STACK.