Hupo's comments

Hupo | 13 years ago | on: PeerJS — True peer-to-peer data in the browser

Looks like a nice little library, and very relevant to my interests. My personal "dream scenario" is to have true multisource P2P video streaming in the browser with no extensions required - with WebRTC, this seems quite feasible as implementations mature.

Hupo | 13 years ago | on: JavaScript is the new Perl

The fun thing about JavaScript's type coercion and concatenation is that you can use it to create brain benders like this:

  +[[+!+[[]]]+[![]][+[[]+[]]]++] === 10
This evaluates to true. Cookie for anyone who can explain why!
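
For anyone who gives up on the cookie, the pieces decompose roughly like this (spoiler warning — this is my own breakdown of the coercion steps, not anything authoritative):

```javascript
// Each sub-expression coerces arrays to strings or numbers:
console.log(+[[]]);           // 0     -> [[]] stringifies to "", and +"" is 0
console.log(+!+[[]]);         // 1     -> !0 is true, and +true is 1
console.log(+[[]+[]]);        // 0     -> [] + [] is "", and +[""] is 0
console.log([![]][+[[]+[]]]); // false -> [false][0]
// [![]][+[[]+[]]]++ returns the old value coerced to a number: +false === 0
// (the ++ writes into a throwaway array, so nothing observable changes).
// The inner expression is therefore [1] + 0, i.e. "1" + "0" === "10",
// and the outer +["10"] is 10.
console.log(+[[+!+[[]]]+[![]][+[[]+[]]]++] === 10); // true
```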

Hupo | 13 years ago | on: Our First Node.js App: Backbone on the Client and Server

>But you know when the data is changing -- when an article has been updated and republished ... or when you've done another load of the government dataset that's powering your visualization. Waiting for a user request to come in and then caching your response to that (while hoping that the thundering herd doesn't knock you over first) is backwards, right?

>I think it would be fun to play around with a Node-based framework that is based around this inverted publishing model, instead of the usual serving one. The default would be to bake out static resources when data changes, and you'd want to automatically track all of the data flows and dependencies within the application. So when your user submits a change, or your cron picks up new data from the FEC, or when your editor hits "publish", all of the bits that need to be updated get generated right then.

You mean most things don't already do this? I've been working on a personal blog engine with this as one of the core ideas (basically all static assets and pages are compiled on edit), and I thought it was a pretty obvious way to go about it. Looks like I'm indeed not the only one to think of it, but it's a bit surprising to me how new the idea is presented as being.
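
The core of that "bake on change" model fits in a few lines. Here's a minimal sketch (all names are made up for illustration): each page declares its data dependencies, and updating a data source re-renders exactly the pages that depend on it.

```javascript
// Pages declare which data sources they depend on.
const pages = {
  "/index.html": { deps: ["posts"], render: d => `<h1>${d.posts.length} posts</h1>` },
  "/about.html": { deps: ["bio"],   render: d => `<p>${d.bio}</p>` },
};

const data = { posts: [], bio: "hello" };
const baked = {};  // the "static" output, regenerated on writes instead of reads

function update(source, value) {
  data[source] = value;
  // Rebuild only the pages whose dependencies include this source.
  for (const [path, page] of Object.entries(pages)) {
    if (page.deps.includes(source)) baked[path] = page.render(data);
  }
}

update("posts", [{ title: "First" }]);
console.log(baked["/index.html"]);   // "<h1>1 posts</h1>"
console.log("/about.html" in baked); // false - unrelated pages aren't rebuilt
```

A real engine would track dependencies automatically rather than declaring them by hand, but the inversion (render on write, serve static files on read) is the same.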

Hupo | 13 years ago | on: Amazon Elastic Transcoder

>MeGUI profiles

MeGUI is hardly necessary - x264 has a good set of presets and tunes built in to begin with. --preset veryslow --tune film/animation/grain will already get you very far; beyond that, pretty much the two most important things to tweak are the strengths of AQ and the psychovisual optimizations (--aq-strength and --psy-rd).

>it is totally within the realm of possibility to put two hours of 1080p content on a single-layer DVD (4.4GB), in a format compatible with any Blu-Ray player out there (AVCHD, a subset of the Blu-Ray standard that accepts DVD as the storage layer), while keeping video quality at a very high level - basically indistinguishable from commercial Blu-Ray discs.

You might get away with an hour of almost-transparent content if it's not particularly bitrate-demanding, but two hours of live action will not look "indistinguishable from commercial Blu-ray discs". 5 Mbps High Profile L4.0 H.264 just won't look as good as the ~30-40 Mbps High Profile L4.1 H.264 commonly found on BDs (unless the BD is really screwed up). At 720p you'd get pretty good results, though.
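
For reference, the arithmetic behind that 5 Mbps figure — a rough sketch that ignores audio and container overhead:

```javascript
// How much average bitrate does a single-layer DVD allow
// for two hours of content?
const capacityBits = 4.4e9 * 8;      // 4.4 GB in bits
const durationSeconds = 2 * 60 * 60; // two hours
const mbps = capacityBits / durationSeconds / 1e6;
console.log(mbps.toFixed(1)); // "4.9"
```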

Hupo | 13 years ago | on: Amazon Elastic Transcoder

The thing that annoys me about both this and Zencoder is that for people who are actually experienced with video encoding, there is absolutely no way to tweak e.g. the underlying x264 settings. There are quite a few settings that have no effect on decoding in any way but are pretty important in getting the most out of the video at a given bitrate (most notably the strength/mode of AQ and the psychovisual optimizations). In the case of AWS, there doesn't even seem to be any kind of "general" tuning available (like whether the content is film, animation or extremely grainy - x264 has --tune settings for these, among others, and Zencoder at least exposes this option[1]), making it pretty much one-size-fits-all. I could always rent a generic server and use that for my encoding needs, but it'd be much more convenient if these cloud transcoding services simply offered advanced configuration for people who know what they are doing.

Also, even as a "simple" cloud transcoding service, Amazon's offering is pretty limited in what it can do right now[2] - you can basically only encode H.264 & AAC in MP4 and define the profile, level and bitrate, and that's about it. Zencoder has many more options in comparison and is generally more transparent about what its encoding software actually does. (Sadly, when I asked them about getting direct access to x264 settings, they replied along the lines of "they could change and things might break for users!" I don't think that would be an actual issue, since direct settings ought to be for advanced users only, who should expect things to change - plus Zencoder could simply notify users of direct settings before upgrading, so they'd have time to adjust their settings if necessary.)

[1] https://app.zencoder.com/docs/api/encoding/h264/tuning

[2] http://docs.aws.amazon.com/elastictranscoder/latest/develope...

Hupo | 13 years ago | on: In Love with LÖVE

This isn't directly related to the blog post, as that's about stuff available on OS X, but if you're on Windows I can very much recommend checking out Construct 2 by Scirra[1]. It's a pretty fantastic HTML5 game engine with an editor (and the whole "no programming required" angle is really just marketing - even though it uses a "visual" event system, you still need to understand programming concepts like loops and conditions to make effective use of it). They have a feature-limited (but otherwise unrestricted) free edition available too.

[1] http://scirra.com

Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?

One problem is that even if I found that VP8 performed very well on one or two particular clips (out of the 28 HD test clips available), I couldn't say for sure why that is the case. There seems to be no clear information on which clips benefit from which kinds of features, and as I'm not an expert on video encoding technology, it'd be hard for me to deduce these things by myself. General conclusions could still be reached, obviously, but if I were going to such lengths it'd suck not to get more detailed overall results.

Anyway, I brought the subject up with some Xiph folks on IRC. Maybe in the future the test clips will come equipped with more detailed information to help with testing. It'd also benefit smaller-scale tests, since it'd allow one to identify possible biases more easily.

Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?

>If your test was to compare an intraframe between vp8 / baseline h264 / and Theora, you would have concluded Theora was the best by a wide margin.

But it wasn't. I was comparing the visual quality of the whole video, and provided the full encoded clips for people to download and compare for that reason.

I am willing to do further test encodes, but I have no interest in doing something like encoding all 28 HD test clips available on derf's test clip page[1] - as a purely visual comparison, especially with the actual encodes provided, it would be incredibly exhausting.

EDIT: I added a notice about the downsides of single clip comparison to the top of the post.

[1] http://media.xiph.org/video/derf/

Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?

>not very representative clip

I chose the clip based on what Dark_Shikari (x264 developer) had to say about it[1]:

>It shouldn't bias too heavily towards any one encoder like many of the other standard test clips will:

>a. It's relatively high motion, so it won't bias heavily against encoders without B-frames or qpel (as, say, mobcal does).

>b. It's not so high motion that it would cripple video formats that don't support motion vectors longer than 16 pixels (e.g. Theora).

>c. It's not something that benefits an unreasonably large amount from some of x264's algorithms (which is why I picked this and not parkrun).

[1] http://forum.doom9.org/showthread.php?t=154430

I could have done multiple test encodes, sure, but the problem in this case was that downloading several gigabytes of raw source material isn't exactly instant. And even if I tested with multiple clips, I doubt the conclusion would be that much different.

Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?

H.264 isn't exactly "closed" per se - the standard itself is freely available[1], which is why we have great free and open source encoders and decoders for it. What you mean by "closed" is most likely just "patent-encumbered", which it most certainly is and which affects anyone wanting to use it commercially (at the moment you can use it freely for non-commercial purposes on the internet, but this may or may not change in the future).

[1] http://www.itu.int/rec/T-REC-H.264-201201-I/en

Hupo | 13 years ago | on: VP8 vs H.264 – Which One is Better?

That he has. My test clip choice was inspired by him[1], and I also link to another blog post of his[2] in the conclusion. That VP8 blog post is almost three years old now, though, and the comment I'm replying to in my article claimed that VP8 is better than H.264 "at this point". This is why I did my test with the latest and greatest encoders available for both formats today.

[1] http://forum.doom9.org/showthread.php?t=154430

[2] http://x264dev.multimedia.cx/archives/472

Hupo | 13 years ago | on: H.265 Is Approved

>VP8 (which is also slightly better than h.264 at this point)

I'm sorry, but you have been misled - VP8 is not better than H.264, and the comparison you linked is bad for multiple reasons: it doesn't tell you the exact encoder settings used; it doesn't provide the actual video for users to compare and only shows one frame (for all we know, VP8 might have placed a keyframe there and pumped up the bitrate for it while x264 didn't); it doesn't provide the source for test replication; and so on. Just read these:

http://x264dev.multimedia.cx/archives/458

http://x264dev.multimedia.cx/archives/472

I can do a proper comparison between H.264 and VP8 if you or anyone else is interested, though it'll take at least a few hours (I intend to use the park_joy test clip found in derf's test clip collection[1]).

Also, On2 is famous for hyping its products to heaven and has yet to match its claims, so I remain skeptical about VP9. There's also Xiph working on Daala, but right now that doesn't seem to be much beyond big words.

While I would love an open format to provide better quality than H.264 and even H.265, I wouldn't hold my breath for such a thing.

[1] http://media.xiph.org/video/derf/

Hupo | 13 years ago | on: If You Call Out Bad Code, Make Sure It's Bad First

The way I see it, writing command-line utilities in Node is really no different from writing command-line utilities in Python. Both offer a nice scripting environment, and while Node is more web-oriented, there are still quite a few libraries out there to do stuff with, making it potentially perfectly suitable for writing a utility to do X. Hell, I personally even consider it somewhat more attractive than Python in this regard, because Node has no equivalent to the 2.x/3.x compatibility mess that Python has. (Python wins in having more general-purpose libraries, though, making it potentially more suitable for development that isn't directly web-related.)

And while this was mentioned a few times in the previous thread, it bears repeating: there are quite a lot of people out there using Windows, which includes developers. Node is a first-class citizen on Windows, while grep, sed and friends won't be there out of the box. A properly done Node command-line utility is generally more cross-platform than a shell script using grep and sed would be. And there are quite a few command-line utilities made with Node out there (most revolve around web development, such as build tools like grunt and the CoffeeScript/TypeScript/etc. compilers), so this one isn't unique in that regard either.
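
To illustrate the point, here's a hypothetical sketch (not any particular existing tool) of a grep-like filter in pure Node - no external binaries, so it behaves identically on Windows, Linux and OS X:

```javascript
// Keep only the lines of `text` that match `pattern`.
// Splitting on /\r?\n/ handles both Unix and Windows line endings.
function grepLines(text, pattern) {
  const re = new RegExp(pattern);
  return text.split(/\r?\n/).filter(line => re.test(line));
}

const input = "error: disk full\ninfo: all good\nerror: timeout";
console.log(grepLines(input, "^error"));
// [ 'error: disk full', 'error: timeout' ]
```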

Hupo | 13 years ago | on: Mega has launched

>There'll be plenty of cases when the content is inherently infringing.

True, and it seems that Mega's copyright infringement reporting page[1] has an option for that:

Takedown Type: Remove all underlying file(s) of the supplied URL(s) - there is no user who is permitted to store this under any circumstance worldwide

[1] https://mega.co.nz/#copyrightnotice

Hupo | 13 years ago | on: Mega has launched

>then serve it to Mega along with a DMCA takedown notice.

The thing with copyrighted content, though, is that even if the file you're checking is infringing in certain cases, in other cases it might just as well be completely legit. I wrote about this on an earlier MU submission[1], so I won't repeat all that here, but all in all, even if you knew that file X existed on Mega's servers, it would be pretty damn haphazard to just outright delete it, because you might hurt many legitimate users by doing so.

Anyway, I think Mega could secure users' files simply by encrypting the locator keys with each user's own key, with that data only getting decrypted and parsed client-side when the user logs in with their key. This way you could only prove that a file exists on Mega's servers, but would have no way to check which user(s) it belongs to without cracking the individual user data one by one. And of course, if you don't have an exact file to check against Mega, you wouldn't be able to figure out whether "content X" is hosted there at all, and neither could Mega (since they'd naturally only store locator hashes and the encrypted data itself).

[1] http://news.ycombinator.com/item?id=4824986

Hupo | 13 years ago | on: HTML-based (mobile) weather app

Based on the other HTML-based weather app submission[1], I decided to submit something more accessible in the same category that I've been using a lot recently. Provided by the Finnish Meteorological Institute. My favorite part is that it includes a "feels like" temperature - especially useful during the winter around here!

[1] http://news.ycombinator.com/item?id=4987194

Hupo | 13 years ago | on: The Sorry State of Native App Typography Licensing

There are lots of great free-for-commercial-use typefaces out there. If you don't already have Font Squirrel[1] (which also has a fantastic @font-face generator[2]) bookmarked, you should fix that immediately. Google Web Fonts[3] is another obvious resource.

[1] http://www.fontsquirrel.com/

[2] http://www.fontsquirrel.com/fontface/generator

[3] http://www.google.com/webfonts

Also, while I'm at it, I guess I could name some typefaces I've grown quite fond of: Alegreya, Aller, Cabin, Delicious, Fontin Sans, Lato, Open Sans, PT Sans, Puritan, Quattrocento Sans, Rosario, Source Sans Pro, Ubuntu. All great stuff!
