cairo140's comments
cairo140 | 4 years ago | on: Rust Is the Future of JavaScript Infrastructure
To my eye, JavaScript has three applications:
* A GUI-building language: A language and ecosystem we use to build UIs, where code runs on the end user's machine, served over the Web or as desktop applications using a compatible API (e.g. Electron).
* A scripting language: An approachable high-level C-like language.
* A general purpose programming language: A server-side or developer machine-side programming language for computational tasks like linking dependencies, transpiling code, type checking, orchestrating tasks.
At the first it's unbeatable, and the ergonomics and ecosystem are phenomenal. This is why it "won" as a language to write even desktop apps like VSCode and Discord.
At the second, performance doesn't matter anyway, so making JS capable of scripting simply makes it approachable for folks who learned the language for GUI building.
It's at the third application that the fundamental limitations of JavaScript, the programming language, really show. These are things like the lack of true multithreading (technically possible with Web Workers, but with significant limitations around shared memory and messaging, and high overhead), the lack of low-level primitives to fine-tune performance, and the ease with which authors can accidentally write extremely slow code. So much of this tooling (TypeScript compilation, NPM package resolution/linking, Webpack bundling) in the modern Web ecosystem is regrettably still written mostly in JS.
I hope the JS community rallies around one general-purpose language to offload these tasks to as well as a uniform shim layer like Ruby's C extensions so we don't end up with a chaotic mishmash of different technologies. If nothing else, it would be nice to have a good excuse to write not-JS every once in a while!
cairo140 | 4 years ago | on: Mozilla VPN Completes Independent Security Audit by Cure53
The problem is that the ultimate consumers[1] of these reports, legal and procurement agents at buying companies, themselves don't care about the actual quality of the report except insofar as it satisfies their own transitive legal/sales requirements, and it's turtles all the way down. This harms users/customers at the end of the day, because they don't have time to scrutinize the details of each individual report and don't have any real power. If we care about the end goal (secure, accessible software), we need the auditing firms to collaborate with the government, judiciary, and ancillary vendors to tighten standards to include random[2], uniform checks (same auditor, same methodology, multiple vendors at once).
In the US, OSHA designates NRTLs like UL to perform safety testing, and their certifications are required everywhere from workplace standards to insurance requirements. In comparison, at least for accessibility, merely having any assessment report from your vendor is likely enough CYA to withstand a legal challenge. I acknowledge the power of recent website lawsuits to use the broader ADA to raise the bar here, but the ADA's enforcement-through-private-lawsuits mechanism is spotty, and I think it won't result in enough structural improvement.
[1] - These reports also serve as PR/marketing, which is probably more so the case with Mozilla VPN, but in most enterprise software, where these assessments take place, the marketing side is very much a secondary goal compared to the individual legal/sales relationship that hinges on the report.
[2] - I think removing the opportunity for vendors to fine-tune scope, prepare, or respond to concerns (at least until the next review cycle) is a big step in the right direction, but unfortunately the legal climate is very much all-or-nothing and not good at nuance. Section 508 (I'm not personally familiar with PCI/SOX/etc. but suspect they're similar) is, formally speaking, an all-or-nothing, check-all-the-boxes affair. In that climate, good random audits will basically always fail, and if you set a standard so hard that even reasonable vendors can't meet it with an earnest effort, you'll end up constricting the market into a meta-game of who can hack the auditing process. See federal government procurement.
cairo140 | 4 years ago | on: Netflix intensifies VPN ban and targets residential IP addresses
In the same vein, it can be rational behavior for a market leader who deals in private information (most online advertising companies) to advocate for consumer privacy protections. It "hurts" them, but if the resulting regulations are so onerous that only incumbent(s) can comply, it can restrict the competitive landscape and paradoxically be advantageous to the existing leaders.
cairo140 | 4 years ago | on: Leaving Google
* internalize the cost of core changes and don't push them downstream unless you have a really good reason
* staff a core team to take care of high ROI horizontal efforts that have high fixed cost to do efficiently, e.g. TypeScript migration of a massive codebase
* be careful of what you take on
* complexity and code are costs and not end goals
Reading them is one thing, but seeing the principles play out in detail amidst the mess of reality was illuminating for me. The author's blog[1] touches on these themes a bit more and I think is a good glimpse into their stewardship of the TS/JS codebase.
cairo140 | 4 years ago | on: Leaving Google
The success or failure with which any company hires and retains folks like these is a mystery to me. As the evidence shows, Google (and I suspect almost any >100-person company) is far too much an amorphous glob/slime mold to even have a single coherent approach to this problem. But what are the successful strategies here? How important is it even to retain this kind of talent? How do we know how a company is even doing in this regard?
In my meager-in-comparison 6.5 years at Google, while I certainly saw many amazing SWEs come and many go, I'm far from knowing what the real trajectory was. But I can imagine that for many folks, the perceived trajectory would be a major motivating or demotivating factor to stay or to leave themselves.
cairo140 | 5 years ago | on: Airbnb’s Stunning IPO
Losers: Existing Airbnb shareholders who have shares in a company that could have raised $7B in cash but instead raised $3.5B. In theory, the mispricing "cost" Airbnb around 4% of its market cap, so existing common shares are worth 4% less than they should be.
But there's a caveat[1].
Companies like Airbnb (no stable profit) are almost entirely valued on sentiment and expectation. Assuming perfect information and rational actors, in an alternate universe, a direct listing would have resulted in around a $146[2] share price, on top of which the company could have raised $3.5B in a secondary offering without affecting the share price. They'd have the same assets and liabilities as they have today, except pocketing the difference (by limiting dilution) instead of handing it to investors.
However, there's no certainty that the valuation would be $146 today had they gone this route. Humans, especially hype-powered retail investors who like to jump on flashy IPOs, are very susceptible to things like IPO bumps when making their buy or sell decisions. An entire field[3] of investing with significant buying power essentially relies on predicting sentiment.
If I'm a common shareholder, sure, I'm sad that my shares are worth maybe 4% less than they "should" be and that the financial world pocketed the difference. But I'm probably pretty happy my shares are worth what they are, and I'm not too interested in, for example, rehashing the deal or picking an offering strategy that could have swayed investor sentiment in a way that made my shares worth much more than 4% less.
[1] - The impact of "more perfectly" priced IPOs on market sentiment and thus public valuation is the most significant factor, but a secondary one is practical: the IPO process carries a lot of risk, both on the part of the underwriting investment bank, which gives the company the money raised in the IPO, and on the part of the small number of powerful investors who get access to (often) preferential pricing from the bank in exchange for an advance commitment to buy the shares and hold them for some agreed-upon time. This illiquid market is dominated by relatively few actors. An IPOing company that doesn't "play ball" may find itself unable to find anyone willing to underwrite its funding round or buy into it at all.

[2] - This theoretical loss per share is computable two ways. The easy way: they "left $3.5B on the table", which divided by Airbnb's current market cap of $84B is ~4%. The other way is counterfactual: Airbnb currently has 600M shares outstanding, having issued 50M shares in the IPO in exchange for $3.5B. So absent the issuance, Airbnb would be worth $80.5B with 550M shares outstanding, or ~$146 per share.

[3] - https://en.wikipedia.org/wiki/Technical_analysis
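Footnote [2]'s arithmetic can be checked in a few lines. This is a back-of-envelope sketch using the comment's own figures; the function name is mine, not anything from the comment:

```python
def counterfactual_price(market_cap, shares_out, ipo_shares, ipo_proceeds):
    """Price per share had the IPO shares never been issued."""
    return (market_cap - ipo_proceeds) / (shares_out - ipo_shares)

# Comment's figures: $84B market cap, 600M shares out, 50M issued for $3.5B.
price = counterfactual_price(84e9, 600e6, 50e6, 3.5e9)
left_on_table = 3.5e9 / 84e9  # "left on the table" as a share of market cap

print(round(price, 2))          # 146.36
print(round(left_on_table, 3))  # 0.042, i.e. ~4%
```

Both routes land in the same place: ~$146 per share and ~4% of market cap.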
cairo140 | 8 years ago | on: Silicon Valley Software Engineer Salaries by Experience Level
[1] - https://www.glassdoor.com/Salary/Google-Software-Engineer-II...
[2] - http://h1bdata.info/index.php?em=google&job=software+enginee...
cairo140 | 8 years ago | on: Silicon Valley Software Engineer Salaries by Experience Level
cairo140 | 9 years ago | on: Dear Microsoft
The founder put his name on it https://mobile.twitter.com/stewart/status/793811616760496128
Agreed, this move by Slack is very disappointing to me. I thought better of them.
cairo140 | 9 years ago | on: In Investing, It’s When You Start and When You Finish (2011)
My concern is that, while this comparison is useful for people further along in their research trying to understand the volatility of the stock market, the chart has a number of (IMO) misleading traits that can dangerously/unfairly steer people newer to managing their own money away from index funds altogether.
I would hope that people see this chart, my comment, yours, and FabHK's excellent comparison to bond yields. But if you have limited attention and are getting started, I would hate for the original link to be the only thing you see.
Speaking from experience with family and friends: too many good people, scared by charts like this, bought gold in 2011 or trusted mutual fund managers and bought into funds with 4% front-end loads and 2% AUM fees.
cairo140 | 9 years ago | on: In Investing, It’s When You Start and When You Finish (2011)
I feel this graphic, while informative and delightful, is insidious in its choice of scale and its lack of comparisons.
On scale, it colors +3% to +7% real returns as "neutral". This makes it seem like the stock market is sometimes good, sometimes bad, but overall may as well be just okay. I feel that 0% nominal returns, or even 0% real returns, is a more honest neutral anchor, and even the latter would need comparisons against other asset classes to paint an accurate picture.
On comparisons, it does a disservice to its readers by not adding a tab showing bond yields and a tab showing cash/treasury yields (which would be dark red across the board except light red around 1930).
These slights in the graphic unfairly make the stock market look unfavorable and make the suboptimal strategy of keeping your money out of the market seem much more favorable than it is.
Adding a specific example: the "worst 20 years" is 1961 to 1981, with a BIG RED -2.0% a year, as if someone lost money and in retrospect would have been better off stowing cash under the mattress. According to the BLS[1], US inflation averaged 5.7% a year over that period, so the mattress strategy yields about -5.7% a year in real terms, not at all better than investing in the S&P.
[1] - http://data.bls.gov/cgi-bin/cpicalc.pl?cost1=1&year1=1961&ye...: $1 in 1961 was $3.04 in 1981, 3.04^(1/20) = 1.057
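The footnote's annualization step, spelled out (the price ratio is the BLS figure cited above; the helper name is mine):

```python
def annualized_rate(price_ratio, years):
    """Average annual rate implied by a total price ratio over `years` years."""
    return price_ratio ** (1 / years) - 1

# $1 in 1961 cost $3.04 in 1981, i.e. ~5.7% average annual inflation,
# so "cash under the mattress" lost ~5.7%/yr in real terms.
print(round(annualized_rate(3.04, 20) * 100, 1))  # 5.7
```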
cairo140 | 9 years ago | on: Investing for Geeks
On scale, it makes it seem like +3% to +7% real returns is "neutral", as if the stock market is sometimes good, sometimes bad, but overall may as well be just okay.
On comparisons, it does a huge disservice by not adding a tab showing bond yields and a tab showing cash/treasury yields (which would be dark red across the board except light red around 1930).
I feel these slights make the graphic present stock investing in an unfairly unfavorable light and make the suboptimal strategy of keeping your money out of the market seem much more favorable than it is.
cairo140 | 9 years ago | on: Ask HN: How much of your time is spent in meetings?
cairo140 | 9 years ago | on: Ask HN: I've invented / patented something powerful. Now what?
Your idea is good. Let's just say that.
What you have to do next is to differentiate yourself from the crazies. That means some of the following:
* A tech demo you can show someone in 30 seconds
* A "shiny" frontend that the layperson can understand
* A concrete value proposition, other than that it can be acquired
You don't want to be stuck in the following pattern, which makes you no different from the rest:
* Secrecy
* Promises
* Vagueness
cairo140 | 9 years ago | on: Google isn’t safe from Yahoo’s fate
Where do you hear this goal? Google Q2 Ads revenue was $21.5B. AWS Q2 revenue was $2.6B. Google's cloud business is currently small even compared to AWS.
cairo140 | 9 years ago
Most companies (including the one I was hiring for) cannot compete on price[1], so the only way they can hire a particularly desirable candidate is to find a particular fit between candidate and company/role. A strong cover letter is a strong leading indicator of potential fit, so it is very desirable.
[1] - I think the money is there and the economics probably work out to "just pay programmers more", but there are social factors that make this strategy unviable. In particular, most firms are very averse to paying programmers more than the non-technical staff to whom they report, so the wage ceiling for programmers at a firm ends up capped at the market-clearing price for those business professionals.
cairo140 | 9 years ago | on: Ask HN: Given the cost of living, why would a developer live in Silicon Valley?
My rent went up by around $2,000 a month. At a marginal tax rate (Medicare + Federal + State) of ~40%, this meant gross income would have to go up by $40,000 a year to "break even" ($24,000 a year of extra rent, grossed up). Tack on an extra $5,000 per year to cover the high state tax rate (this difference varies based on origin state). In comparison, my gross income went up by over $50k within a year and $150k within 3 years.
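That break-even arithmetic can be sketched as follows. The gross-up formula is standard; the specific figures ($2,000/month rent increase, ~40% marginal rate) are the comment's, and the function name is mine:

```python
def breakeven_gross(monthly_rent_delta, marginal_rate):
    """Extra gross income needed to cover an after-tax rent increase."""
    annual_after_tax = monthly_rent_delta * 12   # $24,000/yr of extra rent
    return annual_after_tax / (1 - marginal_rate)

# $2,000/month more rent at a ~40% marginal rate:
print(breakeven_gross(2000, 0.40))  # 40000.0
```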
There are other aspects to "why I personally live here". (1) Job security is much better, not in the sense that companies are more loyal or successful here, but in that there is so much going on that the strength of the market itself provides security; I feel that if I ever lost my job or wanted to change, I could. (2) Because of that market strength, employers treat employees much better here than in smaller markets. (3) Everybody else is here. The number of people I can learn from here is amazing to take in.
There are definitely things that will tip the scale in favor of leaving. Kids and buying a home are big ones, since those costs are even more outsized here than rent alone. Also, the valley is disproportionately friendly to the cutting-edge, type-A career, and the numbers (in terms of the jobs you can get) are less friendly if you want to settle down. For these reasons it's likely I'll eventually leave, but in the meantime, the cost of living is offset by a long shot, to say nothing of the intangible benefits.
cairo140 | 10 years ago | on: The New York Times reaches 1M digital only subscribers
However, nytimes.com has a very regrettable pricing structure, as you say. They seem to aim to be Comcast-like. The pricing is:
* Heavily promotional (special price 50%+ off sticker price)
* Difficult to cancel (you are asked to call them, although they will ultimately run it through email)
* Opaque in billing (no way to actually look at the terms of your current subscription/promotion; full payment history hidden away to obscure how much you're paying them)
They do this no doubt because it works in the short term and bumps up numbers. But as a consumer I can't help but feel they are losing goodwill and face in this new age and hope they come around to more contemporary online billing practices.
[1] http://www.nytimes.com/interactive/2015/09/21/business/media... is the best example I can find. It's very nice to get stuff like this every week. Kudos to the web team.
cairo140 | 10 years ago | on: Ask HN: Google recruiting – Asking for current compensation
In my case, I was already being paid well going into Google's offer process, but considerably below Google's band for the level I got approved for. I gave them my comp and they beat it by a large margin, to the point where they clearly weren't considering my existing comp. So I think if you just refused to answer, they'd give you the minimum in the band and wait for a counter or acceptance.
Two further points that I think are relevant for a minority of candidates:
For folks on the L3/L4 or L4/L5 margin who may get bumped up a level based on existing comp: if anything, you'd rather be underleveled to start. There's so much to learn at Google, and so many of the folks around me are so damned talented, that I'd feel like even more of an impostor if my comp history brought me up to a level I was not at. Some anecdotes from Quora are at [1].
Secondly, there is some lore about high compensation plans (HCPs) at Google. If you believe you are paid well towards the higher end or above what Google pays for your band, you can Google around for this to get some stories of what it's about. This didn't apply to me and I don't know of anyone in this situation so I can't speak to it.
[1] https://www.quora.com/How-do-I-negotiate-a-higher-job-level-...
The key idea is that any human-written change only ever touches ~1-100 files, and there's never any need to store or maintain the entire source tree in your local persistence store. By only ever working on deltas, you can have highly efficient distributed and cached builds. This architecture imposes significant constraints: you must express your build graph in a modular way [2], you collocate your distributed FUSE backend and build system for speed, and all development is over the network. But it comes with many benefits at scale: near-perfect caching, instant "branch switching", fast builds even with deep and wide dep graphs, and excellent affected target determination so flakes/outages in one part of the codebase don't affect the rest of it.
[1] - https://dl.acm.org/doi/pdf/10.1145/2854146
[2] - https://bazel.build/
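As a toy illustration of the affected-target determination mentioned above (not Google's actual system; the graph and names are hypothetical): given a build graph and the handful of files/targets a change touches, a reverse-dependency walk finds only the targets that need rebuilding or retesting.

```python
from collections import defaultdict, deque

def affected_targets(deps, changed):
    """deps maps each target to the set of targets it depends on.
    Returns the changed targets plus everything transitively depending on them."""
    rdeps = defaultdict(set)
    for tgt, ds in deps.items():
        for d in ds:
            rdeps[d].add(tgt)          # invert the graph: dep -> dependents
    seen, queue = set(changed), deque(changed)
    while queue:                        # BFS over reverse dependencies
        for dependent in rdeps[queue.popleft()]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

deps = {
    "app":    {"libui", "libnet"},
    "libui":  {"libbase"},
    "libnet": {"libbase"},
    "tools":  set(),
}
# A change touching only libui affects just libui and app; "tools" is untouched,
# so its flakes/outages don't block anyone else.
print(sorted(affected_targets(deps, {"libui"})))  # ['app', 'libui']
```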