Hupo's comments
Hupo | 13 years ago | on: PeerJS — True peer-to-peer data in the browser
Hupo | 13 years ago | on: JavaScript is the new Perl
++[[]][+[]]+[+[]] == 10
This evaluates to true. Cookie for anyone who can explain why!
Hupo | 13 years ago | on: Shapecatcher: Draw the Unicode character you want
But that aside, this looks like a neat idea. Not something I have any immediate use for myself, but could certainly be useful in some situations.
Hupo | 13 years ago | on: Our First Node.js App: Backbone on the Client and Server
>I think it would be fun to play around with a Node-based framework that is based around this inverted publishing model, instead of the usual serving one. The default would be to bake out static resources when data changes, and you'd want to automatically track all of the data flows and dependencies within the application. So when your user submits a change, or your cron picks up new data from the FEC, or when your editor hits "publish", all of the bits that need to be updated get generated right then.
You mean most things don't already do this? I've been working on a personal blog engine with this as one of the core ideas (basically, all static assets and pages are compiled on edit), and I thought it was a pretty obvious way to go about it. Looks like I'm indeed not the only one to think of it, but I'm a bit surprised by how new the idea is presented as being.
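The "bake on change" idea can be sketched in a few lines of Node. This is a minimal illustration of the inverted publishing model, not anyone's actual engine; all names here (Baker, definePage, etc.) are made up for the example:

```javascript
// Minimal sketch of "bake on change": each page declares the data keys
// it depends on; writing a key immediately re-renders every dependent
// page, so the served output is always prebuilt static content.
class Baker {
  constructor() {
    this.data = {};   // data key -> current value
    this.deps = {};   // data key -> Set of dependent page names
    this.pages = {};  // page name -> { render, keys }
    this.output = {}; // page name -> last baked output
  }

  definePage(name, keys, render) {
    this.pages[name] = { render, keys };
    for (const k of keys) (this.deps[k] ??= new Set()).add(name);
    this.bake(name); // bake once on definition
  }

  set(key, value) {
    this.data[key] = value;
    // rebuild only the pages that actually depend on this key
    for (const page of this.deps[key] ?? []) this.bake(page);
  }

  bake(name) {
    const { render, keys } = this.pages[name];
    const view = Object.fromEntries(keys.map(k => [k, this.data[k]]));
    this.output[name] = render(view);
  }
}
```

A user edit, a cron job, or an editor hitting "publish" would all funnel through `set()`, so the dependent static pages are regenerated right then rather than on request.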
Hupo | 13 years ago | on: Amazon Elastic Transcoder
MeGUI is hardly necessary - x264 has a good set of presets and tunes built in to begin with. --preset veryslow --tune film/animation/grain will already get you very far; beyond that, the two most important knobs to tweak are the strengths of adaptive quantization and the psychovisual optimizations (--aq-strength and --psy-rd).
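Putting those flags together, a typical invocation might look like the following; the CRF and the specific AQ/psy values are illustrative assumptions to tune per source, not recommendations from the comment:

```shell
# Example x264 command line using the options discussed above.
# --crf 18, --aq-strength 0.8 and --psy-rd 1.0:0.0 are starting points
# to adjust per source, not universal settings.
x264 --preset veryslow --tune film \
     --crf 18 --aq-strength 0.8 --psy-rd 1.0:0.0 \
     --profile high --level 4.1 \
     --output out.264 input.y4m
```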
>it is totally within the realm of possibility to put two hours of 1080p content on a single-layer DVD (4.4GB), in a format compatible with any Blu-Ray player out there (AVCHD, a subset of the Blu-Ray standard that accepts DVD as the storage layer), while keeping video quality at a very high level - basically indistinguishable from commercial Blu-Ray discs.
You might get away with an hour of almost-transparent content if it's not particularly bitrate-demanding, but two hours of live action will not look "indistinguishable from commercial Blu-ray discs". 5 Mbps High Profile L4.0 H.264 just won't look as good as the ~30-40 Mbps High Profile L4.1 H.264 commonly found on BDs (unless the BD is really screwed up). At 720p you'd get pretty good results, though.
Hupo | 13 years ago | on: Amazon Elastic Transcoder
Also, even for a "simple" cloud transcoding service, Amazon's offering is pretty limited in what it can do right now[2] - you can basically only encode H.264 & AAC in MP4 and define the profile, level and bitrate, and that's about it. Zencoder has many more options in comparison[1] and is generally more transparent about what its encoding software actually does. (Sadly, when I asked them about getting access to x264 settings directly, they replied along the lines of "they could change and things might break for users!" I don't think that would be an actual issue: direct settings ought to be for advanced users only, who should expect things to change - and Zencoder could simply notify users of changes to direct settings before upgrading, so they have time to adjust if necessary.)
[1] https://app.zencoder.com/docs/api/encoding/h264/tuning
[2] http://docs.aws.amazon.com/elastictranscoder/latest/develope...
Hupo | 13 years ago | on: In Love with LÖVE
Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?
Anyway, I brought the subject up with some Xiph folks on IRC. Maybe in the future the test clips will come equipped with more detailed information to help with testing. It'd also benefit smaller-scale tests, since it'd make it easier to identify possible biases.
Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?
But it wasn't. I was comparing the visual quality of the whole video, and provided the full encoded clips for people to download and compare for that reason.
I am willing to do further test encodes, but I have no interest in doing something like encoding all 28 HD test clips available on derf's test clip page[1] - as a purely visual comparison of the actual encodes, that would be incredibly exhausting.
EDIT: I added a notice about the downsides of single clip comparison to the top of the post.
Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?
I chose the clip based on what Dark_Shikari (x264 developer) had to say about it[1]:
It shouldn't bias too heavily towards any one encoder like many of the other standard test clips will:
a. It's relatively high motion, so it won't bias heavily against encoders without B-frames or qpel (as, say, mobcal does).
b. It's not so high motion that it would cripple video formats that don't support motion vectors longer than 16 pixels (e.g. Theora).
c. It's not something that benefits an unreasonably large amount from some of x264's algorithms (which is why I picked this and not parkrun).
[1] http://forum.doom9.org/showthread.php?t=154430
I could have done multiple test encodes, sure, but the problem in this case was that downloading several gigabytes of raw source material isn't exactly instant. And even if I tested with multiple clips, I doubt the conclusion would be that much different.
Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?
Hupo | 13 years ago | on: H.265 Is Approved
I'm sorry, but you have been misled - VP8 is not better than H.264, and the comparison you linked is bad for multiple reasons: it doesn't tell you the exact encoder settings used; it doesn't provide the actual video for users to compare, only a single frame (for all we know, that frame might be a keyframe in VP8 that got a bitrate boost the x264 frame didn't); it doesn't provide the source material for replicating the test; and so on. Just read these:
http://x264dev.multimedia.cx/archives/458
http://x264dev.multimedia.cx/archives/472
I can do a proper comparison between H.264 and VP8 if you or anyone else is interested, though it'll take at least a few hours (I intend to use the park_joy test clip found in derf's test clip collection[1]).
Also, On2 is famous for hyping its products to the heavens while never quite matching its claims, so I remain skeptical about VP9. There's also Xiph working on Daala, but right now it doesn't seem to be much more than big words.
While I would love an open format to provide better quality than H.264 and even H.265, I wouldn't hold my breath for such a thing.
Hupo | 13 years ago | on: If You Call Out Bad Code, Make Sure It's Bad First
And while this was mentioned a few times in the previous thread, it bears repeating: there are a lot of people out there using Windows, and that includes developers. Node is a first-class citizen on Windows, while grep, sed and friends won't be there out of the box. A properly done Node command-line utility is generally more cross-platform than a shell script using grep and sed would be. And there are quite a few command-line utilities made with Node out there (most revolve around web development - build tools like grunt, compilers for CoffeeScript/TypeScript/etc.), so it's not like this one is unique in that regard either.
Hupo | 13 years ago | on: Mega has launched
True, and it seems that Mega's copyright infringement reporting page[1] has an option for that:
>Takedown Type: Remove all underlying file(s) of the supplied URL(s) - there is no user who is permitted to store this under any circumstance worldwide
Hupo | 13 years ago | on: Mega has launched
The thing with copyrighted content, though, is that even if the file you're checking is infringing on copyrights in certain cases, in other cases it might well be completely legit. I wrote about this on some earlier MU submission[1], so I won't repeat all that here, but all in all, even if you knew that file X existed on Mega's servers, it would be pretty damn haphazard to just outright delete it, because you might be hurting many legitimate users by doing so.
Anyway, I think Mega could secure users' files simply by encrypting the locator keys with each user's own key, so that this data only gets decrypted and parsed client-side when the user logs in with their key. That way you could prove that a given file exists on Mega's servers, but would have no way to check which user(s) it belongs to without cracking the individual user data one by one. And if you don't have an exact file to check against Mega, you can't even figure out whether "content X" is hosted there at all - and neither can Mega (since they'd naturally store only locator hashes and the encrypted data itself).
Hupo | 13 years ago | on: HTML-based (mobile) weather app
Hupo | 13 years ago | on: The Sorry State of Native App Typography Licensing
Also, while I'm at it, I guess I could name some typefaces I've grown quite fond of: Alegreya, Aller, Cabin, Delicious, Fontin Sans, Lato, Open Sans, PT Sans, Puritan, Quattrocento Sans, Rosario, Source Sans Pro, Ubuntu. All great stuff!