So, two big design issues. First, if you scroll down from the wheel of terms, it goes wobbly; clicking the down arrow below that last wheel is also wobbly. Second, if you use one of the tabs to go to a graph, you create a 'back loop': the back button takes you to the start, back again takes you back to the graph, then back to the start, then back to the graph...
The data is interesting, but I expect to see a bit of analysis in projects like this. Without it, the project becomes a sort of data Rorschach test where the viewer projects their perceptions onto it.
We are aware of a lot of the issues on the home page and plan to completely redesign it soon. However, which graph are you talking about? If it's the post tracking, we will definitely look into it.
> "Dogs" occurs more times in titles in r/aww than "cats" or "kittens"
That can be a fact.
> Despite their internet popularity cats are not submitted nearly as many times on this cuddly SubReddit.
Or dogs are rare enough that they're worth naming, because cats are default. Seriously, run whatever calculations you want, but be careful about what conclusions you draw from the numbers.
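A toy example of that base-rate caveat (the titles below are invented, not real r/aww data): counting which titles mention a term is not the same as counting which posts are about that animal.

```python
from collections import Counter
import re

# Invented sample titles -- illustrative only, not real r/aww data.
titles = [
    "My dog learned to fetch",
    "Look at this sleepy kitten",
    "Dogs being dogs",
    "He just sits there judging me",   # a cat post that never says "cat"
]

def term_counts(titles, terms):
    """Count how many titles mention each term (case-insensitive, whole words)."""
    counts = Counter({t: 0 for t in terms})
    for title in titles:
        words = set(re.findall(r"[a-z]+", title.lower()))
        for term in terms:
            if term in words:
                counts[term] += 1
    return counts

print(term_counts(titles, ["dog", "dogs", "cat", "cats", "kitten", "kittens"]))
```

"cat" scores zero here even though a quarter of the sample is cat content: raw title counts measure naming habits as much as submission rates.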
Nothing like a bunch of comments tearing apart a website built by people who, presumably, learned to code a few months ago and threw it together over the course of a few weeks. Yeah, it has a few issues, but it's significantly better than the first things I ever built and sent off to the world.
It's fundamentally horrible. The whole design approach is horrible. It's the contemporary equivalent of GeoCities, just using more recent but equally mindless design clichés.
In fact, it's so horrible it makes PowerPoint look good.
The site was borderline unusable with Chrome on Android. Text was overlapping, animations to display new text were triggered after I had scrolled past them, and in some cases the text just flew in one side and right off the other side of the screen before I even got a glimpse of it.
My prior on that is extremely low, so my guess is that's like, of the posts that make front-page? Or of the top posts. What this really reflects, therefore, is something more like how up-votey people are.
This is great and all, but if there is one tool I'd like to see, it's a personal data exporter.
I want to get all the posts and messages I have ever posted, going back many years, and beyond the 1000 cutoff in their comment history feeds. I'd like to run some keyword analysis on my own data, search it, access it however I see fit. As it stands now, there is no way to retrieve it.
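A sketch of the kind of keyword analysis I mean, assuming a hypothetical export format (a JSON list of objects with a "body" field — no such exporter exists today, so you'd have to collect the data yourself via the API before hitting the 1000-item cap):

```python
import json
from collections import Counter

def keyword_histogram(path, top=20):
    """Return the `top` most frequent words across an exported comment history.

    Assumes a hypothetical export format: a JSON array of {"body": "..."}.
    """
    with open(path) as f:
        comments = json.load(f)
    words = Counter()
    for c in comments:
        # Crude tokenization: split on whitespace, strip trailing punctuation.
        words.update(w.strip(".,!?").lower() for w in c["body"].split())
    return words.most_common(top)
```

Once the data is local you can search it, chart it, or slice it however you see fit — which is exactly the access Reddit doesn't currently give you.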
How much of "the Reddit" did you actually use to calculate these stats? Is this based on a one-time snapshot from the API (which limits you to 1000 posts per type of query), or is it a longitudinal crawl (and if so, how was the sample produced)?
For how long did you collect data? During previous crawls, I've found that one of my spider bots can scrape through about 11.5k submissions and 51k comments per day (while observing Reddit's API access rules).
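For reference, the pacing behind those numbers is just client-side throttling. A minimal sketch (the 2-second interval reflects Reddit's API guideline at the time, and `RateLimiter` is my own helper, not a library class):

```python
import time

class RateLimiter:
    """Space outgoing requests at least `interval` seconds apart."""

    def __init__(self, interval=2.0):
        self.interval = interval
        self._last = 0.0

    def wait(self):
        """Block until enough time has passed since the previous request."""
        now = time.monotonic()
        delay = self.interval - (now - self._last)
        if delay > 0:
            time.sleep(delay)
        self._last = time.monotonic()
```

Call `limiter.wait()` before each API request; at one request every 2 seconds the wall-clock budget, not the code, is what bounds your daily submission and comment counts.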
This looks awesome. Next step is to make it "real-time."
I worked for an analytics company and got to build some pretty awesome visualizations using D3, and one of the problems I always ran into was that while the visualizations are cool, you rarely get any actionable information from the charts. I feel like this would be a lot better if, at the end, there were some call to action.
Real-time analytics is tricky due to API limits (unless you can accept a "real time" of minutes or hours per update).
Example: Twitter's search API is limited to 15 queries of 100 Tweets every 15 minutes. Do you query 100 Tweets every minute, or 1500 Tweets every 15 minutes?
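Either way the arithmetic caps out the same; the choice is latency versus burstiness:

```python
# Twitter search API ceiling: 15 requests of 100 Tweets per 15-minute window.
WINDOW_MIN = 15
REQUESTS_PER_WINDOW = 15
TWEETS_PER_REQUEST = 100

# Option 1: one request per minute -> a smooth, low-latency trickle.
steady_per_min = TWEETS_PER_REQUEST * REQUESTS_PER_WINDOW // WINDOW_MIN

# Option 2: burst all 15 requests at once, then wait out the window.
burst_per_window = TWEETS_PER_REQUEST * REQUESTS_PER_WINDOW

assert steady_per_min * WINDOW_MIN == burst_per_window  # identical throughput
```

So the real question is whether your "real time" display needs fresh data every minute or can tolerate a 15-minute refresh cycle.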
I've got an experimental, real-time comment search engine over at commentfindder.com. Perhaps some useful visualizations could be built on top of it. Do you have any ideas for what kind of actionable chart would be interesting?
ctide: Keep it classy, HN.
oliverhunt: Is this saying that all posts in r/technology get an average of 2027 karma?