_where's comments

_where | 5 years ago | on: Stealth – Secure, Peer-to-Peer, Private and Automateable Web Browser

SSL alone is not enough to protect data.

Any proxy could act as a man-in-the-middle (MITM), so someone using a malicious fork of Stealth may cause problems.

But, the net is like this already. One site may send you to another site that tricks you into giving up your data. And a relatively recent vulnerability prevented any WebKit-based browser from correctly indicating whether a site's URL pointed at the correct server, so you'd have no visible way of knowing a site using HTTPS was legitimate.

Using a VPN could be better, but it's sometimes worse, because you just change who is trusted (the VPN provider): they know one of the addresses you're coming from and everything you're doing, and can record and sell that data.

_where | 5 years ago | on: Stealth – Secure, Peer-to-Peer, Private and Automateable Web Browser

If the network isn't free and data is centralized, one day you think you have it all and the next you could have nothing. Tor pretends to be secure, but it's dark and compromised. This project seems to understand that and wants to try again, fixing it via P2P in a way that has promise.

The simple implementation of web forms is broken on today's web. A "form" is an input field, or some other element styled as an input field, that may or may not be grouped in a form element, possibly generated dynamically. WebSockets, timers, validations, ... it's a huge PITA.

The DOM is a freaking mess. It's not there until it's there, it's detached, it's shared. It's been mangled so much, there's no clear parent anymore.

ECMAScript: which version, and which interpretation, should Babel translate for you? Would you like obfuscation via webpack, and how about some source maps with that, so you can de-obfuscate to debug it? Yarn, RequireJS, npm, and you need a closing script tag: should it go in the body or the head? You know the page isn't fully loaded yet, and it never will be. There, it's done, until that timer goes off. Every generated element was just regenerated; the old ones are hidden, but the new ones carry the same script with different references. Sorry, that was the old framework; use this one, it's newer, and this blog or survey says more people use it.

For a P2P open data-sharing network over HTTPS, the proxy could route a request onward to someone else down the path. Not everything is direct.

_where | 5 years ago | on: Atlassian moving to cloud-only, will stop selling server licenses

Calling it “moving to the cloud” has such pompous overtones. They stopped giving away access to their source for free, just like the majority of the rest of their community. Atlassian might as well be any other company now with a really good product suite. Years ago, they were awesome. They gave free licenses to open source projects. I still think they’re the best, but this cloud-only thing sucks. The software has value outside of their hosting it.

_where | 5 years ago | on: Re-Thinking the Desktop OS

Definitely. Windows got worse after XP, mostly with 8. Why did they move things around and set up multiple ways of doing things? Gnome got worse after v2, and what was Ubuntu thinking with ads? OSX/macOS/iOS got worse with flat design (which thankfully morphed), the App Store, and the awful Windows-style install nanny.

It's not just the interface. iOS and macOS got embedded spyware years ago, and it's still there; they can backdoor whenever. Dig through your logs and you'll sometimes see output from a menu choice when something connects to your machine. Windows has similar issues, from what I've read. I'm willing to give up some privacy, but it seems like B.S. to make people pay for things that do that in such a hidden way. It leaves things exposed. Unfortunately, the desktop OS alone isn't enough for security either. Hardware doesn't lock down; modern computers have potential openings for injected instructions in multiple places.

_where | 5 years ago | on: Re-Thinking the Desktop OS

Security is hardware level first, at multiple levels. So, I don’t think we’ll get it back, at least until eternal love beats evil.

_where | 5 years ago | on: Re-Thinking the Desktop OS

I want the old pre-App Store OSX/macOS that doesn't nanny my installs or break old drivers, with a good package manager, and easily tabbed and gridded terminal windows without necessarily needing tmux.

I’d also like GPL’d Windows XP running flawlessly like a mac.

I’d use Linux on the desktop, but I’ve never liked any of the desktop managers and it was never as reliable as OSX/macOS.

_where | 5 years ago | on: I found black-hat content marketers: sockpuppet bloggers, fake Reddit/HN accts

I'm with you that it's a serious problem. If it weren't, Amazon and others wouldn't be working so hard on AI to combat the AI (or humans) that have beaten their AI.

Those aren't "fake accounts", though. They're real accounts being abused. There's a difference. If Amazon and Twitter allow it to happen, it will happen. But what does shaming accomplish here? It just means people waste time talking about it. It has little chance of changing behavior. More likely, the outcome could be Reddit and HN enforcing a real ID. That may hurt the community, because not all of us want our name on everything. It's not that I don't stand behind what I'm saying; I'm just not going to treat every post like a sign I have to carry around for the rest of my life, even though at some point, maybe I'll have to!

_where | 5 years ago | on: I found black-hat content marketers: sockpuppet bloggers, fake Reddit/HN accts

There are no fake accounts. There are only accounts or not.

There are no sockpuppet bloggers; those are called bloggers.

If you go down the "fake" road, you have to realize real news by real reporters is often wrong. I've had someone in law enforcement tell me that a televised news report accidentally labeled them the victim and never corrected it.

Textbooks written for US public school students: I understand that some of their content has been incorrect and intentionally biased.

There is truth, but it’s an ideal.

I’m glad that this is calling out those that are manipulating people, but on the other hand- what is the goal?

Will shaming bring fairness?

We could have communist dictatorial leaders enforcing their version of truth, if you’d rather have that sort of thing.

Our president should tell the truth, and it should be a scandal if not, to a point of course, because I’d bet most have lied at times.

But if it’s time to activate something like a libel superpower on the internet, how would that even work in a fair and practical way?

Freedom of speech cannot be freedom only to tell truth; truth can be aspired to, but not necessarily known by all, and what’s understood to be truth by some may change. So, really, what should be done?

Btw- I’ve done my best in past years to tell the truth as much as I can when I’m not kidding around, and it typically makes things difficult, but better. I’m not recommending anyone fake up things to boost rep. But, it’s happening, it’s not good, and I don’t see how AI or oversight or a control play would end well when it comes to enforcing truth. However, the notion of a “fake” account is what allows most of the users to post content on HN and Reddit more freely.

_where | 5 years ago | on: Bash Error Handling

If it's not the default, there's a reason. Bash is literally running commands in a shell session; think of a terminal session. When a command fails, would you want the terminal session to end? That'd be annoying.

The same reasoning applies to unset variables. Referencing an undefined variable shouldn't break your session. Why initialize it anyway? Requiring initialization means more code to change even when the variable isn't used, and you'd have to call the script with A= just to signal that A wasn't defined; in the process, A is now assigned an empty string instead of only receiving a default when needed, which costs extra memory and execution time.
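A minimal sketch of the default behavior versus opting in with `set -u` (the variable names A and GREETING are just illustrative):

```shell
#!/bin/sh
unset A GREETING   # make the demo deterministic

# By default, an unset variable expands to an empty string:
echo "A is '$A'"                  # prints: A is ''

# Supply a default at expansion time, without assigning anything:
echo "${GREETING:-hello}"         # prints: hello
echo "still unset: '$GREETING'"   # prints: still unset: ''

# Opt into strictness only where you want it:
set -u
# echo "$A"   # would now abort with an "unbound variable" error
```

Note that `${GREETING:-hello}` never assigns to GREETING; the default exists only for that one expansion, which is exactly the "only default when called" behavior described above.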

The pipeline doesn't die because &&, ||, and parentheses are seriously helpful for one-liners.
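For example (the paths and commands here are hypothetical):

```shell
#!/bin/sh
# Run the second command only if the first succeeds:
mkdir -p /tmp/demo && echo "created"

# Provide a fallback when a command fails:
grep -q nosuchuser /etc/passwd || echo "not found"

# Parentheses group several steps under one fallback
# (note: they also run the group in a subshell):
( cd /tmp/demo && touch marker ) || echo "setup failed"
```

If failures instead killed the whole pipeline, the `||` fallback branches would never get the chance to run.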

Don’t think of it as a script. Think of it as a script for a shell.

_where | 5 years ago | on: What Working At Stripe Has Been Like

This is a great read, but by way of feedback, I stopped here: "I cannot say this enough: pick your peer group wisely because you're giving them write access to both your conscious thoughts and your entire worldview."

I've heard this before, but here there's an ulterior motive, which may not benefit you. Stripe benefits from startups: if you hang out with successful professionals, mentors, and people who don't bring you down, the startup community will probably be successful, and that in turn helps Stripe grow.

But would you ignore a family member who needs your time and attention? How about a friend who seems to be tanking? If Warren Buffett were to have serious health problems, I seriously doubt Bill Gates would be like "whatever, I don't have time for that." He's going to be there. Because that's what you do. Sure, there are times when you need your space, and friends change or become abusive. Maybe you need to withdraw some, get off of social media. But being a fair-weather friend or family member just because someone else is not on their A game, or never was, is B.S.

I'm not discounting the advice; if your only goal is what's generally perceived as worldly success, keeping only peers who boost you is one way to get it. But the greatest person who ever lived had a bad friend who got him killed. He wasn't like, "sorry, I have dinner plans."

_where | 5 years ago | on: Why Life Can’t Be Simpler

Examples:

Would it make sense to have more fields visible at once? Maybe not making users dig through 10 pages to find the "Covid" checkbox would be better.

Would it make sense to have a single schema monolith with simple HTML forms vs microservices, 100 schemas, persistent connections, streaming, and burn-rate-heavy cloud infrastructure?

Do you need all of those JS libraries that look simple but are actually more complex than vanilla JS to set up and maintain?

Object storage seemed simple, but how will you translate all of that JSON later when the model must change over and over again?

Is that code using lambdas and annotation-based cloud logging, supposedly so much more readable and maintainable, generating more frequent customer-facing delays from GC and latency than the loop-and-shared-variable equivalent that writes to a file?

Counter-examples: code that could be simplified at no cost, storing customer data whose structure is out of your control, use of well-maintained libraries that save time and add value, and relatively low-cost use of cloud services thanks to good planning and design.

_where | 5 years ago | on: Honda quits F1, invests in carbon-free tech instead

> When the sport was awash with tobacco money about 15 years ago, the big teams all had an entire testing team

So, we just need companies with a lot of money that are interested in technical advances and advertising and don't care about RoI the way banks might? What about Google, Apple, Microsoft, Oracle, Salesforce, ... any takers?

_where | 5 years ago | on: Three Ways of DevOps

The post’s argument is illogical.

It doesn't follow from wanting to speed things up for other teams that you can skip some kinds of testing in the pipeline because you have monitoring. That makes no sense.

It also says a full test suite in the pipeline is not needed because there should be monitoring in production.

Monitoring doesn’t catch all bugs, and letting things fail at a later stage isn’t the right path for a CMM level 5 application that lives and/or serious investment depend on.

Granted, application monitoring puts additional ongoing load on the application, which slows it down, working against the stated goal of speed; still, it can help identify slow, insecure, or error-prone queries. But it shouldn't have to run constantly unless its overhead is low. And if problems are merely reported after deployment rather than gating deployment, you'll keep pushing crap code into production.

So, what I'm reading here is that DevOps, as defined, values getting things through the pipeline quickly, so that when bugs, security issues, and performance problems are found, they aren't a DevOps problem: those production issues can be handled by developers, because they'll have fast feedback from monitoring that maybe they haven't killed anyone yet?

_where | 5 years ago | on: Why I left my tenured academic job

The ugly at universities is much worse than what was posted.

Athletes getting easy courses. People in power positions who typically don't deserve them but think they do. So there are ridiculous projects, bias, and people getting paid in "non-standard" ways. Backroom deals with private industry and government. Things that would make no one want to go there.

Universities should work more with private industry like they did many years ago, government money should be poured into research more openly, and tuition needs to rise; then they can hire the best from private industry who are also excellent teachers or researchers.

To do that, administrations and staff will need to be gutted.
