rebootthesystem's comments

rebootthesystem | 7 years ago | on: Why a Typical Home Solar Setup Does Not Work with the Grid Down

Looks like a great article. I'll have to defer reading it in full. During a quick scan I saw a mention of the SMA technology that allows dedicated outlets to be powered up in case of grid failure.

I built a 13 kW ground-mount system feeding a pair of SMA inverters. I have tested this feature by disconnecting from the grid and enabling the outlets (one per inverter). I didn't quite get to the 2,000 W rating SMA claims, but got close, which means that with this size of array and two inverters I get somewhere between 3 kW and 4 kW of power to run various devices while the sun is up.

Considering that we might have a couple of power outages a year on average (if that), I felt this was a reasonable investment. Going with batteries is just too expensive and not justifiable at all given the reliability of the grid. One way to think about this is that the grid is your battery. A stretch, I know.

Funny that there's a picture of a gasoline generator towards the end of the article. My guess is that I am likely to invest in a 5 kW to 6 kW generator before I ever add batteries to this system. Again, it's a matter of ROI. Also, I would not go with a gasoline-powered generator at all. Gasoline degrades with time, which can make maintaining the system a nightmare with sporadic use. I think a propane-fueled generator might be a better idea. The fuel does not degrade, so as long as you don't have leaks it'll be there ready to go when you need it.

I know way too many people who have been mercilessly duped by these solar companies who come in, hook them on some kind of a lease, install inadequate systems and move on to the next victim. Lack of understanding on the side of consumers has created a situation where solar is equivalent to magic and unscrupulous actors can take advantage of them. That part is sad.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

Nice try at a weak ad hominem.

Your parent comment is utterly irrelevant. The conversation is about the C language and the perception some seem to have that it has problems. My only argument here is that a capable software engineer knows the language and tools he or she uses and has no such problems, particularly with a language as simple as C. Things like pointer "surprises" are 100% pilot error, not a deficiency of the language itself.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

Nowhere did I say that modern languages don't have their place and advantages. I use them all the time. In fact, I prefer them when they make sense for precisely the reasons you point out.

You might be reading far more into my comments than what they were intended to address. Namely that blaming languages for the failings of software engineers is dishonest. A true software engineer will know the chosen tools and languages and use them appropriately. Blaming C for pointer issues is dishonest and misguided. There's nothing wrong with the language if used correctly.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

That's a great story, thanks!

I once worked on a project that needed specialized timing in relation to high-speed (well, 38.4k) RS422 communications. I don't remember all of the details; it's been decades. I remember one of the engineers came up with a super clever way to trigger the time measurement and actually measure it. Rather than using a UART, he bit-banged the communications and used the serial stream itself (meaning the ones and zeros) for timing. It worked amazingly well. If I remember correctly that was a Z80 processor with limited resources.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

So you claim an array actually exists in a computer?

OK. Prove it. And you have to do it without laying out a set of rules and conventions that might allow us to interpret a list of bytes as an array.

An array is a fabrication by convention. At the simplest level it is a list of numbers in memory. Adding complexity, you can store additional numbers that indicate type, size and shape. Adding yet more complexity, you can extend that to lists of memory addresses pointing to other lists of numbers, thereby supporting the concept of each array element storing more than just a byte or a word. And, yet another layer removed, you can create a pile of subroutines that let you do a bunch of standard stuff with these data structures (sort, print, search, add, subtract, trim, reshape, etc.).

Nowhere in this description does an array exist. There were experimental architectures ages ago that actually defined the concept of arrays in hardware and attempted to build array processors. These lost out to simpler machines where multidimensional arrays could be represented and utilized via convention and software.

Arrays do not exist. If you land in the middle of a bunch of memory and read the data at that location without having access to the conventions used for that processor or language nothing whatsoever tells you that byte or word is part of an n-dimensional array. The best you can say is "The number at location 1234 is 23". No clue about what that might mean at all.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

> BTW, even physical machines have undefined behavior, when values exceed the specs and there's no telling what might happen

And if you (plural) are an ENGINEER, it is your JOB to KNOW these things and prevent them from happening.

I get the sense that the term "software engineer" has been extended so far that we grant it to absolute hacks who know nothing about what they are doing and what their responsibilities might be. Blaming the language, the compiler and the machine is a perfect example of this.

True engineering isn't about HOPING things will work. It is about KNOWING things will work. And testing to ensure success.

I've been involved in aerospace for quite some time. People can die. This isn't a game. And it requires real engineering not "oh, shit!" engineering that finds problems by pure chance. Sadly, though, we are not perfect and things do happen. It isn't for lack of trying though.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

> all you have to do is change some code generation option on the compiler command line and millions of lines of code now produce different instructions.

It is the responsibility of a capable software engineer to KNOW these things and NOT break code in this manner.

You are trying to blame compilers and languages for the failure of modern software engineers to truly understand what they are doing and the machine they are doing it on.

If you truly understand the chosen language, the compiler, and the machine, and take the time to plan, guess what happens? You write excellent code that has few, if any, bugs, and everyone walks away happy.

And you sure as heck are not confused or challenged in any way by pointers. I mean, for Picard's sake, they are just memory addresses. I'll never understand why people get wrapped around an axle with the concept.

I wonder, when people program in, say Python, do they take the time to know --and I mean really know-- how various data types are stored, represented and managed in memory? My guess is that 99.999% of Python programmers have no clue. And I might be short by a few zeros.

We've reached a moment in software engineering where people call themselves "software engineers" and yet have no clue what the very technologies they are using might be doing under the hood. And then, when things go wrong, they blame the language, the compiler, the platform and the phase of the moon. They never stop to think that it is their professional duty to KNOW these things and KNOW how to use the tools correctly in the context of the hardware they might be addressing.

I've also been working with programmable logic and FPGAs, well, ever since the stuff was invented. Hardware is far less forgiving than software --and costly. It forces one to be far more aware of, quite literally, what every single bit is doing and how it is being handled. One has to understand what the funny words one types translate into at the hardware level. You have to think hardware as you type what looks like software. You see flip-flops and shift registers in your statements.

This is very much the way a skilled software developer used to function before people started to pull farther and farther away from the machine. It is undeniable that today's software is bloated and slow. Horribly so. And 100% of that is because we've gotten lazy. Not more productive, lazy.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

You just proved my point. A programmer who truly knows (a) the machine they are working with and (b) the language they are using will know exactly how to use both in order to deliver intended results.

For example, reading the processor data book to understand the chip, its instruction set and how it works can be crucially important in certain contexts. I would not expect someone doing JavaScript to do this, but how many have studied their virtual machine in depth?

Don't confuse being lazy with problems with languages and compilers.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

This is a typical misinterpretation of the reality of programming. There is no such thing as undefined behavior. Once you get down to bits and bytes in memory and instructions the processor does EXACTLY what it is designed to do and told to do by the programmer.

Despite what many might believe the universe didn't come to a halt when all we had was C and other "primitive" languages. The world ran and runs on massive amounts of code written in C. And any issues were due to programmers, not the language.

In the end it all reduces down to data and code in memory. It doesn't matter what language it is created with. Languages that are closer to the metal require the programmer to be highly skilled and also carefully plan and understand the code down to the machine level.

Higher level languages --say, APL, which I used professionally for about ten years-- disconnect you from all of that. They pad the heck out of data structures and use costly (time and space) code to access these data structures.

Object oriented languages add yet another layer of code on top of it all.

In the end a programmer can do absolutely everything done with advanced OO languages in assembler, or more conveniently, C. The cost is in the initial planning and the fact that a much more knowledgeable and skilled programmer is required in order to get close to the machine.

As an example, someone who thinks of the machine as something that evaluates list comprehensions in Python and uses OO to access data elements has no clue whatsoever about what might be happening at the memory level with their creations, or how. Hence code bloat and slow code.

I am not, even for a second, proposing that the world must switch to pure C. There is justification for being lazy and using languages that operate at a much higher level of abstraction. Like I said above, I used APL for about ten years and it was fantastic.

My point is that blaming C for a lack of understanding or awareness of what happens at low levels isn't very honest at all. The processor does exactly what you, the programmer, tell it to do. Save failures (whether by design or such things as radiation-triggered upsets), I don't know of any processor that creatively misinterprets or modifies instructions loaded from memory, instructions put there by a programmer through one method or another.

Stop blaming languages and become better software developers.

rebootthesystem | 7 years ago | on: Conflating pointers with arrays: C's biggest mistake? (2009)

My guess is this won't be a popular post given the average age of HN participants.

There's nothing whatsoever wrong with C. The problem is programmers who grew up completely and utterly disconnected from the machine.

I am from that generation that actually did useful things with machine language. I said "machine language" not "assembler". Yes, I am one of those guys who actually programmed IMSAI era machines using toggle switches. Thankfully not for long.

There is no such thing as an "array". That's a human construct. All you have is some registers and a pile of memory with addresses from which to store and retrieve things. That's it. That is the entire reality of computing.

And so, you can choose to be a knowledgeable software developer and be keenly aware of what the words you type on your screen actually do or you can live in ignorance of this and perennially think things are broken.

In C you are responsible for understanding that you are not typing magical words that solve all your problems. You are in charge. An array, as such, is just the address of the starting point of some bunch of numbers you are storing in a chunk of memory. Done. Period.

Past that, one can choose to understand and work with this or saddle a language with all kinds of additional code that removes the programmer from the responsibility of knowing what's going on at the expense of having to execute TONS of UNNECESSARY code every single time one wants to do anything at all. An array ceases to be a chunk-o-data and becomes that plus a bunch of other stuff in memory which, in turn, relies on a pile of code that wraps it into something that a programmer can use without much thought given.

This is how, for example, coding something like a Genetic Algorithm in Objective-C can be hundreds of times slower than re-coding it in C (or C++), where you actually have to mind what you are doing.

To me that's just laziness. Or lack of education. Or both. I have never, ever, had any issues with magical things happening in C because, well, I understand what it is and what it is not. Sure, yeah, I program and have programmed in dozens of languages far more advanced than C, from C++ to APL, LISP, Python, Objective-C and others. And I have found that C --or the language-- is never the problem, it's the programmer that's the problem.

I wonder how much energy the world wastes because of the overhead of "advanced" languages? There's a real cost to this in time, energy and resources.

This reminds me of something completely unrelated to programming. On a visit to windmills in The Netherlands we noted that there were no safety barriers to the spinning gears within the windmill. In the US you would likely have lexan shields protecting people and kids from sticking their hands into a gear. In other parts of the world people are expected to be intelligent and responsible enough to understand the danger, not do stupid things and teach their children the same. Only one of those is a formula for breeding people who will not do dumb things.

Stop trying to fix it. There's nothing wrong with it. Fix the software developer.

rebootthesystem | 7 years ago | on: APL deserves its renaissance too

One of the most powerful aspects of APL is its notation. Ken Iverson himself wrote a paper titled "Notation as a Tool For Thought". Here it is:

http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pd...

I remember watching Iverson deliver a presentation in person about this very topic.

Anyone familiar with fields such as mathematics or music understands the power of notation. The integral symbol conveys information and allows you to think about the problem rather than the mechanics.

APL in early times suffered from a unique problem: you had to physically modify your computer and printer to be able to do APL. You had to remove and replace the character generator ROM on your graphics card (who remembers those cards?). You had to get a new keyboard or put stickers all over your standard keyboard. And you had to change the print wheel or print ball (on IBM printers) to be able to see, type and print APL characters.

It was a pain in the ass. Only the most interested cult members endured that level of pain for an extended period of time.

Years later, Iverson decided to transliterate APL symbols into combinations of standard ASCII characters. This was a knee-jerk reaction to the above-stated problem. What he did not have was the vision to recognize that technology would take care of this on its own. Not long after the introduction of J, everyone could display and print graphics of any kind. The APL character set, the symbols, ceased to be a problem in that regard.

Iverson took the wrong road with J out of --conjecture on my part-- commercial interest rather than language interest. He violated something he personally talked about: The value of notation as a tool for thought.

J doesn't need to exist. If we are to evolve APL and move into a world where symbolic programming is a reality (something I think would be very powerful), we need to move away from typing ASCII characters into a keyboard and toward a paradigm where advanced software engineering has its own notation that can be used to describe problems and create solutions with the kind of expressive power we have not seen in mainstream computing in years.

rebootthesystem | 7 years ago | on: APL deserves its renaissance too

> It's circular reasoning. Languages get those things by being popular.

Maybe it is but it's reality. Also, there's the other kind of reality: Languages don't matter. Solving problems is what matters.

I've programmed in everything from Machine Language (note I did not say "Assembler") to APL, passing through languages like Forth, C, C++, FORTRAN, Objective-C, Lisp, PHP, JS, Python, etc. At the end of the day the ONLY thing that matters --if it isn't a hobby-- is solving problems computationally. I have no cult adherence to any language whatsoever. They are tools, that's all.

My best example of this was making tons of money solving a problem using Visual Basic for Applications, which allowed me to use Excel to automate a time consuming task in a CAD program. It just so happened that this CAD program could be automated using VB. Put the two together and several months of work and we had a tool worth quite a bit of money.

APL still has lots of value...in the right circles. I believe it still sees professional usage in the finance industry.

rebootthesystem | 7 years ago | on: APL deserves its renaissance too

No, it does not. I used APL professionally for about ten years back in the '80s. I love the language. It is incredibly powerful. Once you internalize it, it's like playing the piano: you don't think about the mechanics, you play music.

However, the language did not stand the test of time for far more important issues than the inconvenience of the character set and the keyboard.

And, no, J is not a successor to APL, even though Iverson created it. J is an abomination. He made a mistake. He thought that abandoning notation --which is incredibly powerful-- would solve the APL popularity problem. What he ended-up creating was a royal mess of the first degree. It's garbage.

APL could be very useful today but someone with the time and context needs to organize an effort to evolve it into a modern language that retains the power of what got branded as a "tool for thought" while adding layers of functionality that are sorely missing. I wish I had the time to embark on this journey. I would love to do something like that, but I can't.

Again, the character set and keyboard are not the problem. I used to touch type APL. Didn't take that long to get there. People learn to drive vi/vim. It's a matter of having to have a reason to make the effort.

And the ecosystem. That's another huge issue.

This has two aspects: finding qualified programmers, and having access to libraries so you don't reinvent the wheel.

Back in the day I used to do a lot of work with Forth as well. Great language for the right applications, but finding qualified Forth programmers was difficult when the language was popular and it became nearly impossible with the passage of time.

APL suffers from the same problem, a seriously limited talent pool.

I probably don't need to explain the value and power of having libraries to support a wide range of applications. Python is a good example of this today. You can find a library to do just about anything you might care to approach with Python, from desktop through embedded and web. In many ways the breadth and depth of available libraries can be far more important than language capabilities and sophistication. After all, if you had to write OpenCV from scratch, there's no amount of APL magic that is going to make you more efficient and effective than a 15-year-old kid with Python and OpenCV.

I see APL mentioned on HN with some frequency. I feel that some here are in love with the idea of APL rather than understanding the reality of APL. Again, I love the language, but there's a reason I stopped using it about 25 years ago.

What's interesting is that C, which I started using way before APL, is still around and still a very solid language (with lots of libraries) for the right applications.

rebootthesystem | 8 years ago | on: DB-19: Resurrecting an Obsolete Connector (2016)

I have been in physical product development and manufacturing for three decades. Doing business with US manufacturers has become more and more difficult over time. What you describe here is very true and only the tip of the iceberg. I have, for example, sent out 50 requests for quotes for machined components only to be utterly ignored by most of the shops I contacted. The same exercise with China results in an almost overwhelming number of quotes received almost instantly. They are open for business. Have been for a while. I, frankly, have no clue what game we are playing.

rebootthesystem | 8 years ago | on: Tesla Is Facing a Crucible

That's not quite right. You can hold a short position for as long as you want. There is no contract saying you have to close the transaction in 30 days. In that sense the otherwise excellent analogy provided isn't accurate (which isn't a problem, it's meant to be a simplification).

What does happen if the stock goes up is that your brokerage company will ask you to put up the delta. In other words, if it goes from $100 to $110 you'll be required to deposit the equivalent of $10, the delta, times the number of shares you shorted. If you shorted 1,000 shares you'll have to deposit $10,000 for every $10 of upwards movement in the stock price. If you have long (traditional stock buying) positions in your account your broker might actually sell those automatically to cover this delta.

The other important point is that this is a loan, which means you will pay interest on the funds --in this hypothetical, $100,000. The interest charged can vary. If, for the sake of an example, we assume 5% simple annual interest, this means $5,000 per year, or just over $400 per month.

I used to day trade (about 20 years ago) and would use shorting multiple times per day. I am not sure I would consider shorting for long term (> 1 day) positions. As many have said, the potential for loss is great.

rebootthesystem | 8 years ago | on: How to Run Your Own Mail Server (2017)

I have to say, I've been running my own mail server without a single problem for years. OK, the way I do it is a stretch of the definition of "running my own mail server," but I have full control.

What do I do?

Get a nice VPS from a company like GoDaddy. Set up whatever domains you need. Set up as many email accounts as needed. And off you go. No problem. I don't even have to think about email.

I thought about rolling my own on one of our Linode servers, but every time I compare the no-brainer of doing the above to what it would take to run this ourselves on Linode, I can't justify the pain and aggravation.

What I don't like about the Gmail approach (other than it is Gmail and I do not trust Google to not shut down all of our accounts for some stupid reason) is the cost. I can spend a few bucks a month on a VPS and have a hundred email addresses. The same on Gmail would cost significantly more and you would be under their irrational thumb.

A few years ago I looked into running Zimbra on Linode. Back then it was so resource hungry it just didn't make any sense. I wonder if it has gotten any better over time? I really like the concept.

rebootthesystem | 8 years ago | on: Amazon's Fake Review Problem

Amazon has the power to fix this. They don't seem to care. Simple logic and heuristics would be enough.

Amazon allows anyone to post a review. You don't even have to buy the product. That is fundamentally wrong.

That's the first easy step: If you did not buy the product on Amazon you cannot post a review.

Amazon allows people who receive deep discounts to post reviews. That is ripe for manipulation.

And so, the second filter is simple: If someone doesn't pay at least N%, say 50%, for a product they don't get to post a review.

Amazon allows people to post a review at any time, even before the product ships. You can post a review for toothpaste before you actually use it.

The third filter would include a variable purchase-to-review period. The length of this period is different depending on the type of product. Maybe someone who buys a USB cable can post a review a couple of weeks after actually receiving it --but not sooner. Someone buying weight loss pills might need to wait 60 days.

In other words, introduce some common sense into a process that would only allow actual retail buyers of a product to experience the product for a reasonable amount of time before allowing them to post a review, positive or negative.

It goes beyond that. Negative reviews need to be routed to the vendor before they appear publicly on Amazon. Why? Amazon needs to give vendors a first shot at solving the problem. The current system is moronic. People can give a USB cable a bad review because they don't like the color. The thing might work just fine but the person wanted a different shade of green and can give the product a 1 star review. This is nonsense.

In general terms, Amazon reviews, due to Amazon's own incompetence, are pretty much worthless these days.
