knappador's comments
knappador | 11 years ago | on: Treeline (YC W15) Wants to Take the Coding Out of Building a Back End
In short, they're integrating some commonly used APIs and backend programs (Elasticsearch etc.) and putting a graphical programming model on top.
Crappy backends are pretty easy to throw up on a VPS or Heroku these days. It's so easy that I'm likely to select app servers and a language based on which library (if any) is most critical to that particular request type.
At first I wasn't sure who the target customer could possibly be. Was it a PaaS? Backend-as-a-service? A service like this lives in a weird space. It's almost only good for early product dev and for teams whose backend devs are green.
Things like unit tests, scale, and completely custom capability will drive everyone who makes a successful product out at the very time they can start to afford to pay a lot.
Don't burn all your money yet ;-D
knappador | 11 years ago | on: Vulkan – Graphics and computing belong together
With Metal and Mantle, it was clear that graphics programmers wanted to use GPUs at lower levels of abstraction to more efficiently exploit GPU designs, which have a much different architecture than a decade ago. Without a corresponding standards-compliant option, low-level APIs threaten to fragment GPU programming.
One effort I didn't see mentioned, which seems to have some future role, albeit a completely uncertain one, is the Heterogeneous System Architecture (HSA) Foundation, spearheaded by AMD and with seemingly every chip-maker except Nvidia and Intel on board.
As GPU stream processors get more CPU-like while remaining natural at parallel processing, it's only a matter of time before we get the right abstractions: map gets scheduled across many stream processors and reduce runs on a higher single-thread-performance CPU. Suddenly computer vision and many other naturally parallel data-synthesis workloads are programmed in a less heterogeneous software environment and executed on chips where the stream processors and CPUs share a great deal, possibly down to micro-op compatibility via something Nvidia could be pushing towards in their Denver architecture (as yet highly speculative).
To somewhat less-than-enthusiastic coverage, Nvidia has been building up partnerships with automakers like crazy. Computer vision is obviously one of the applications that self-driving cars will require. Tango and other projects are also quiet beneficiaries of Nvidia tech maturing into the Tegra platform.
We are far from having conquered programming and CPU design just because JITs are good, 8GB of RAM is expected, or GPUs can mine megahashes per second. We're in some future's bad old days. That Khronos is involved in unifying graphics and compute APIs as well only makes the exciting question of who will drive our cars more intriguing.
knappador | 11 years ago | on: Isomorphic JavaScript is not the Answer
knappador | 11 years ago | on: Breach – A new modular browser
Unlike Kivy, which is Python, you've chosen JS, so at least your UI is on a more naturally portable layer given that you have a browser runtime right there. Documenting your plumbing into the system libraries will be really critical. The JS community will hack every other piece of the system on their own, but user after user will shy away from the system plumbing because it's not what attracted them initially, so I would prioritize there.
Writing cross-platform UIs may be a pipe dream for now. OpenGL is headed to a weird place with Mantle and Metal while OpenGL Next comes together. The HSA Foundation (notably missing Nvidia and Intel) might offer some hope that CPU architectures and graphics APIs (and GPGPU) will converge again, but right now, HTML5 with JS is probably the best we have for sheer everywhere.
Probably where I diverge most sharply (and somewhat irreconcilably) from the web community, and perhaps even the high-minded ideals of Mozilla, is HTML/CSS, but the "superior" tools I find in Android are themselves relatively new. Do I wish HTML/CSS/JS would go away? Yes, but without need of burial. A technology will displace it someday through sheer elegance that unicorns have not yet fathomed. If it's at all recognizable in what I'm using today, it's an early-stage experiment.
knappador | 11 years ago | on: Purism Aims to Build a Philosophically Pure Laptop
As for this particular case, SSD > liberty for me. I'd rather boot at a reasonable speed than die waiting to copy files.
knappador | 11 years ago | on: The Rise and Fall of the Lone Game Developer
knappador | 11 years ago | on: Meditation Is Garbage Collection for Your Mind
knappador | 11 years ago | on: Fasting triggers stem cell regeneration of damaged, old immune system (2014)
I've done fasts on relatively nothing for over five days. Early on, you get some "empty-stomach intelligence," where your mind is just more alert, probably from plenty of oxygen (from not digesting) and some evolution that favored apes that got observant and driven when hungry rather than lazy.
Fasting long enough to get the effects in the paper will use up all your glycogen stores. Read the Wikipedia article on starvation for details, but basically you want a small supply of starches to keep your skeletal muscle from breaking down to feed your brain. About two days in, without some glucose supply, you will unavoidably slow down. Lethargy commences when your brain is starving. It is not euphoric. It is the world slipping out of your grip without you noticing.
As long as you're taking care of your mandatory glucose supply, fasting for three days or more leads to a common theme in nutrition and health papers: all processes that break things down for re-use are up-regulated and all processes that consume resources are down-regulated.
It's not difficult to propose that a system composed mostly of brand-new, turned-over cells, with muscle proteins aligned to the principal stresses, is more protein-efficient than one resulting from plenty of nutrients floating around like entrepreneurs with too much money. Fasting would be expected to result in general house-cleaning, so that protein that isn't accomplishing much ends up reallocated to doing something useful.
One example of where this shows up in scientific papers is the increased sensitivity of cancer cells to chemotherapy when the patient is fasting. One proposed mechanism was an up-regulation of registration for immune-system-induced self-destruction through the normal pathway; a badly functioning protein might go ahead and function sufficiently in the up-regulated pathway.
The more obvious benefits of fasting are the same as those of anything that builds self-discipline, plus getting in touch with the difference between hunger and a lack of being full; the two are much more distinct after a fast. Cravings for sugar are gone because your hormones for hunger and satiety are all operating properly, as is your pancreas.
My favorite way to get empty is to drink only water, black coffee, or tea: fluids, no macro-nutrients. Get some glucose from bread. Get some micro-nutrients from yogurt and rabbit food. Let the rest slip into starvation mode. Stress level and blood pressure will decline into a kind of euphoria as long as you don't overdo the calorie starving to the point that you're thinking on muscle protein.
I've done it. All the mechanisms have obvious symptoms. If you are just ketogenic and in starvation, you feel euphoric and completely without variations in energy. If you are beyond starvation and skeletal protein is being converted to glucose, you piss dark yellow and feel like you're on anesthetic. Either way you lose a lot of weight, your energy regulates much more tightly, you won't be bothered by hunger, and you'll be drinking coffee as nature intended it instead of some holiday-special milkshake.
It's well established that reduced-calorie diets promote longevity and that the spectrum of metabolic-syndrome conditions is associated with all sorts of bad things, so although I'm not a doctor, I don't expect to regret recommending that anyone try coffee fasting to put their system through its paces.
knappador | 11 years ago | on: Ai Weiwei Is Living in Our Future
In ten years I'll be sharing and leaking at least 10x as much information out of all kinds of devices. 90% of my engagement will happen inside programs that automatically sync data across cloud services. Security is inevitably a growing target in the networked world, and privacy requires security. Increasingly, for the sake of productivity and collaboration, everything I use will be sharing and syncing more and more. The desire of most people to be connected and productive, not some autocratic slide in the world's governments, will be the death of privacy.
One of the features of Facebook I liked in the early days was the slight exclusiveness that made it basically okay to talk about having a giant hangover without fear of looking like an alcoholic to an interviewer scratching up dirt (I think zero interviews I care about do this). When Facebook started making the defaults public without notification, there was understandably some uproar about being unknowingly thrust out into the public. Eroding privacy causing blunders is not the same thing as not having privacy. For the most part, anxiety about not being able to control your privacy or security really needs to be analyzed in the context of just how hidden you need to make your words (in the case of anti-dissent government) or actions (in the case of socially conservative laws) to be able to practice or advocate what you care about. To an almost absolute degree, very few of the things you want to see in the world that don't exist yet will require going to war for the cause of security or privacy before your end goals can be pursued, so it's just not worth it.
The EFF does great work defending people against stupid laws and promoting better laws with regard to IP. They protect anonymity for regular people who happen to cosplay and have very odd taste in character appropriateness. However, the area where the EFF pulled really hard to ensure that the future of the internet would be egalitarian in the United States was Title II common-carrier law, not privacy. The 1st Amendment protects what you say, not some mythical right to say it without consequences; you only get that when nobody cares about your opinion. Even in the case of socially conservative laws, stand up for respect for individual beliefs before you stand up for privacy as an extra-social cure.
Privacy is not close to as fundamentally important as the security required to achieve it, and privacy advocacy is, to an extent, like whining that someone took your tree-house and you can't have any secret clubs anymore. Most privacy is not used to do productive things, and few productive things outside of already entrenched, authoritarian governments require privacy to pursue.
knappador | 11 years ago | on: The Future of Space Launch Is Near
Including the rest of the vehicle's dry mass (the tank, the structure to hold the tank, etc.), each Merlin's dry thrust-to-weight ratio drops to 43.05:
T:W = (654 kN × 1000 N/kN ÷ 9.8 m/s²) ÷ (inert mass ÷ 9.0 engines)
For a COTS dry-mass comparison, take the GE F414 in its high-thrust configuration. Let's say its T:W is 8.0 after factoring in a control system and inlets.
The Isp of a jet engine burning liquid methane is ~2000 s even in full reheat. The Isp of a rocket engine is ~330 s for methane-LOX in a newer full-flow design like Raptor. The impulse-to-fuel-weight ratio is roughly 6x in favor of the air-breathing system.
The dry-mass thrust-to-weight is decisively in favor of the rocket engine. The wet-mass thrust-to-weight is decisively in favor of the jet engine. The effect of these factors on cost is inconclusive from the napkin.
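The napkin above can be checked in a few lines. The 654 kN thrust, the 9-engine split, and the Isp figures are from the comment; the inert mass is an illustrative value back-solved from the quoted 43.05 ratio, since the comment leaves it symbolic:

```python
# Napkin check of the per-engine thrust-to-weight figure.
G0 = 9.8  # m/s^2

thrust_kgf = 654e3 / G0               # per-engine thrust expressed in kgf, ~66,735
inert_mass_kg = 13_950                # back-solved illustration, not a published figure
tw_dry = thrust_kgf / (inert_mass_kg / 9.0)
print(round(tw_dry, 2))               # 43.05

# Isp comparison: ~2000 s air-breathing in full reheat vs ~330 s methalox rocket.
print(round(2000 / 330, 1))           # 6.1x impulse per unit of propellant weight
```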
How can air-breathing engines potentially lower the TCO?
46 reusable RS-25s were flown on 135 shuttle missions, so the average lifetime of each engine was about 8 flights. The lifetime of a gas turbine is measured in thousands of hours, so let's say 200 missions, and without so many major overhauls. Even if SpaceX is pulling off some cool voodoo for Merlin reusability, gas turbines are in a different league in terms of both reliability and lifetime.
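The flights-per-engine figure is straightforward arithmetic, given that each shuttle mission flew three RS-25s (a known fact, not stated above):

```python
# Average flights per reusable RS-25 across the shuttle program.
missions = 135
engines_per_mission = 3
engine_pool = 46
avg_flights = missions * engines_per_mission / engine_pool
print(round(avg_flights, 1))   # 8.8 flights per engine
```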
While a 1st stage rocket body has to re-enter, land, and be transported back to the launch site, an air-breathing 0.5 stage can fly itself back from its short distance downrange, thereby incurring zero transport cost back to the launch site.
An extreme application of air-breathing might include ramjet inlets that open after Mach ~1.8 and take over air-breathing duties up to Mach 4.0 as the 414s idle. Even in that case, the whole system is not that far downrange; it can use altitude to make up quite a bit of the journey back, plus some vertical-landing system if that saves weight. The fuel efficiency lets you do this with 414s in spite of their lower T:W.
Does using air-breathing assist pay off by making a lighter 1st stage or heat-shielding the 1st stage to make a faster release point? Only extensive analysis of the various options can answer these questions. I believe the mechanisms are there, and that's what justifies the design research.
knappador | 11 years ago | on: The Future of Space Launch Is Near
"The trade-off with upgrading the engine to produce 26,400lb-thrust is a considerable hike in maintenance costs. Running the F414 EDE at the higher thrust setting reduces turbine life to 2,000h, Caplan says. This is just one-third of the current 6,000h interval."
"GE also has the option of switching to a more heat-resistant material to make the blades. The company is considering designing the first-stage turbine blades of the GE9X with silicon carbide-based ceramic matrix composites (CMC), which are lighter and more heat-resistant than metal. Indeed, the company tested CMCs in the low-pressure turbine of the F414-GE-400 for a 2011 demonstration."
Turbines designed for rocket assist need to live <100hrs to service 200 launches. GE has basically already said they can push the engine to > 11:1 T:W, not including the advantages of ceramics or LNG combustors, which they also know how to make plenty of for utility gas turbines.
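As a sanity check on the <100 h figure: even a generous per-launch run time keeps a 200-launch career far below gas-turbine service life. The 15 minutes of boost plus flyback per launch is my assumption, not a number from the comment:

```python
# Rough service-hours budget for a rocket-assist turbine over its career.
minutes_per_launch = 15        # assumed boost + flyback run time, not a published figure
launches = 200
total_hours = launches * minutes_per_launch / 60
print(total_hours)             # 50.0 -- well under even the derated 2,000 h life quoted above
```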
knappador | 11 years ago | on: The Future of Space Launch Is Near
A smaller 1st-stage fuel tank might make heat-shielding or higher-temp materials more of an option, which means you can raise the 1st-stage separation velocity closer to the "optimum" in the rocket sense and still recover the 1st stage. This puts the 2nd stage closer to its natural optimum design point.
Overall size might not be a show-stopping problem now, but for building space colonies or Mars missions, overall vehicle size does eventually become concerning. Think of trying to build a jig to friction-stir weld a 180m long, 14m wide fuel tank. It's cool but also a bit of a distraction to have to build and handle everything at unusual scale.
It would be super cool to see big packs of ten F414's at the base separate off, curve back towards the launch site, flip out some wings, and then cobra into a vertical landing. GE should step in right now to have the perfect platform to develop ceramic parts without the same service life requirements as aviation gas turbines.
knappador | 11 years ago | on: The Future of Space Launch Is Near
The more modern F119 has a T:W of about 8. With ceramics and a pure focus on reheat and specific thrust, you can likely get a T:W over 10. There may be other tricks with fuels besides jet fuel that allow even more focus on reheat, making the T:W higher still without necessarily hurting fuel consumption.
Going COTS with the F119 is interesting, but ceramics might be game-changers. Ceramic turbine blades and afterburner pipes would drastically lower weight versus Nickel-superalloy parts while lowering cost and manufacturing complexity. Turbine blades are really impressive, but it's all cost.
knappador | 11 years ago | on: The Future of Space Launch Is Near
Max q is what makes it seem right. An STS flight profile can't even use max thrust again until 50k ft and Mach 2. Turbojets peeling away and gliding back from around 60k ft and Mach 2.5 seems right. This is easier than a sea recovery, and because the burn duration is so short, it almost doesn't make sense to do anything but go full reheat and target the highest thrust in the lightest package. As more ceramic matrix composites replace Nickel, I would expect the T:W to get over 10, so it's really about the smallest airframe you can wrap around a gas turbine and still land on its own.
The airframe size of the 0.5 stage (1st stage?) is the biggest departure from normal. You can fit ten F119s around the base of an F9 without thinking too hard, but that's only ~400k lbf of thrust for 40k lb of engine, and it almost doubles the diameter at the base. Could get weird.
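The cluster numbers above can be sanity-checked against ballpark public F119 figures (~35,000 lbf in afterburner and ~3,900 lb per engine; both are approximations I'm supplying, not numbers from the comment):

```python
# Thrust and weight of a ten-F119 cluster, using approximate public figures.
thrust_per_engine_lbf = 35_000   # ballpark afterburning thrust
weight_per_engine_lb = 3_900     # ballpark engine weight
n_engines = 10
print(n_engines * thrust_per_engine_lbf)  # 350000 lbf -- close to the ~400k quoted
print(n_engines * weight_per_engine_lb)   # 39000 lb  -- close to the ~40k quoted
```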
Ceramic composites will be pretty critical. If you can replace Nickel parts with ceramic, you can get rid of bleed air and raise the temperature, which is what you need when you're sacrificing a little extra NOx for vastly higher T:W.
knappador | 11 years ago | on: The Future of Space Launch Is Near
How would you position your product relative to Heroku in terms of power and how much schlep you trade for how much ease of use? I think it's very okay to be clear about that for now in order to convert well with your core customers.