How about this as a fix that retains the current design and functionality:
As well as including the fnid in the pagination links, also include a pg=1...n parameter that is used only as a fallback if the continuation has been garbage-collected. That way you retain the continuation design that lets the user see the list continued in the order it had on the last page, but if the continuation has been collected it takes the most current ordering and returns page n.
If I had the time this afternoon I would have a look through the code to see if this would work, but unfortunately I don't. Is there anyone here familiar with the code base who could assess whether this is a simple change?
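A rough sketch of the proposed fallback, in Python purely for illustration (HN itself is written in Arc; `fnids`, `more_link`, `handle_more`, and `current_ranking` are all invented names, not anything from the real code base):

```python
# Hypothetical illustration of the proposed fnid + pg fallback scheme.
# fnids maps a continuation id to a closure that renders the next page
# in the ordering the user was originally shown.
fnids = {}

PAGE_SIZE = 30

def more_link(fnid, page_number):
    # Emit both the continuation id and a plain page number, so the
    # link still works after the continuation has been evicted.
    return "/x?fnid=%s&pg=%d" % (fnid, page_number)

def handle_more(fnid, page_number, current_ranking):
    cont = fnids.get(fnid)
    if cont is not None:
        return cont()  # continuation alive: original ordering preserved
    # Fallback: the continuation was garbage-collected, so serve page n
    # of the *current* ranking instead of "Unknown or expired link".
    start = (page_number - 1) * PAGE_SIZE
    return current_ranking[start:start + PAGE_SIZE]
```

The user gets a slightly different page than they would have with the live continuation, but never a dead link.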
It doesn't reflect well on YC that the site is so embarrassing technically. I will now force a speedy fix by suggesting that the problem lies with Lisp.
I know you're kidding, I'm just not sure if you're half-kidding or whole-kidding.
Would YC be more successful if HN was technically improved? Certainly not. Would HN be more successful? Hard to say, but probably irrelevant -- HN is the social focus of the startup community. This has rewards for YC, but comes with taxes as well.
Regardless, there is no trend away from HN visible today, and the cobbler has other work.
:-) I don't think pg needs to prove that it's possible to explicitly carry state from request to request using Arc. You can do that in any language. Continuation passing is a cool hack, though I wouldn't use it myself on a production web site that needed to scale. And I do use Scheme.
This bug doesn't happen only on the homepage, but also in old threads: when you click "More" at the bottom of the page to read the next page of comments, you get this error message instead. If you wait a bit, go back to the previous page, refresh, and click "More" again, it goes away, but it is certainly annoying.
There's no need to "wait for a bit". Just go back and reload the page, so that you have fresh fnids that stand a chance of still being in the cache when you click the link.
Problem? I thought this was a feature. It happens when the user does not interact with a page for some time.
I thought HN has it so that the users are forced to refresh the screen and get the latest news.
Am I wrong here?
Ha. Never thought of that. But no, it's not a feature in the sense of something that was designed with this goal in mind. It's a way of not having to design and implement a whole url scheme, and instead being able to easily run arbitrary code in response to a link being clicked. This architecture, however, has the undesirable (except, apparently, to you) side-effect of occasional "unknown or expired link"s.
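The mechanism being described can be sketched like this (a minimal Python illustration of the general continuation-link idea, not the actual Arc implementation; `register`, `click`, and the cache size are all invented):

```python
import secrets
from collections import OrderedDict

MAX_FNIDS = 3  # toy capacity; the real cache holds far more entries

# Maps a random fnid to an arbitrary closure to run when the
# corresponding link is clicked.
fnids = OrderedDict()

def register(closure):
    """Stash a closure and return a link that will invoke it."""
    fnid = secrets.token_hex(8)
    fnids[fnid] = closure
    while len(fnids) > MAX_FNIDS:
        fnids.popitem(last=False)  # evict the oldest continuation
    return "/x?fnid=" + fnid

def click(url):
    fnid = url.split("fnid=")[1]
    closure = fnids.get(fnid)
    if closure is None:
        return "Unknown or expired link."
    return closure()
```

Note that eviction here is driven by capacity, not by a timer: links die faster when the site is busy and registering lots of new closures, which matches the behaviour people report.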
Couldn't pg just toss an additional GB of RAM into the server and dedicate it to the continuation cache? Just extending the lifetime from a few minutes to a few hours or days should be enough.
It seems to stop the second page becoming stale, forcing you to refresh and get a new copy of the front page first, preventing you from viewing outdated content.
While I agree it's a feature, I'm pretty sure it works the opposite way. The idea is that you want the second page to be "stale": if you've just spent time reading the front page, when you click to the second page you should be getting all new links.
This comes at a cost though, because the server must cache all the possible second pages that might be requested. So HN makes a compromise and caches them for an hour or so. If you wait an hour, it will have been evicted from the cache and the error will appear.
Yes, it is a feature, but a feature that leads to a terrible user experience. So while it may not be a bug, since it's by design, it's still a really big defect.
In my own case: I would be much more of a contributor to HN if I didn't have this problem on a daily basis.
Usually it happens after I have tabs open, go get some food, and come back; then I need to start my browsing on HN all over again. Usually this only happens once, because I say fuck it and go elsewhere.
What are the pros that make this con worth it? I'm not seeing them; it's just a big massive pain in the ass.
No, it's a bug, a design flaw in using continuations. See my other comment. If it was by design then the links wouldn't continue to work for longer at quiet times on the site; they do simply because old continuations haven't been flushed to make room for new ones.
What about the app automagically saving the linked content in a cache and, if the link ever goes dead or the content vanishes, presenting the cached version to users?
I'm unclear what you're suggesting, but your up-votes suggest I'm alone. :-) The fnid (function ID) is the index into the cache storing the continuations. It's this cache that's dropping 'old' entries as new ones are added. Depending on how busy the site is, that takes a varying amount of time.
Keep in mind, as pg confirmed the other day, HN is still running on a single core.
pg/HN has always been good at prioritizing. (And at identifying his interests versus yours, which may often but not always overlap.) Also, there are ready workarounds, if you must, e.g. load the linked page of interest into a new tab before it expires.
This is something people liked about HN, including early on. That pg would make useful decisions and then not take / cave in to cr-p about them.
I'll live with the expiring links, if and as it makes other parts of managing HN easier.
P.S. As I reflect, is some of the increased discomfort and agitation from users on this point due to an increase in mobile browsing, where such user-initiated workarounds face a more cumbersome UI? (Not all, but some.)
It was a lot worse in the past, but pg cut back on the number of fnids being generated, switching to more conventional methods. There are still quite a few around, though, and obviously increased traffic, meaning more new fnids to store, puts pressure on the cache.
He means Hacker News' next-page link. The URL you get by clicking it is unique to you and preserves the order of stories when you navigate from one page to the next, but it has a short lifetime.