sinofsky's comments

sinofsky | 8 years ago | on: Nikon versus Canon: A Story of Technology Change

Yes. I should not have left out image stabilization, which would have been even more beneficial to film :-)

Also a great point about R&D budget during the digital transition. Canon definitely was going "in house" for digital whereas Nikon viewed digital almost as Kodak did--a different film back.

sinofsky | 10 years ago | on: Big Company vs. Startup Work and Compensation

You could apply this same model to IBM very easily if you started from college any time between 1993 (April, to be precise) and 2013. If you just worked your way up the promotion ladder you would have done super well financially (better than the S&P 500 by 4-5x).

The problem with any comparison is hindsight. Many other large cap tech stocks do not behave that way. Certainly you could have joined Microsoft at a certain time and "given up" at the "wrong time". Your cash comp might have been quite good (or not).

The thing to ask yourself (and I definitely am not judging or implying one way or another) is where does your code go and what does it do? A lot of that IBM code, well... And of course many startups don't make it as well.

If all you want is money with low risk the choice is clear. If all you want is a chance at a huge payoff with high risk then the choice is clear.

If you're happy with the end result of your code then in a Fountainhead sort of way that's what matters. Then whichever risk path you take for your compensation is secondary. And your views on this might change at different life stages.

Finally the ability and skills to navigate and succeed in either a startup or a big company are different. Depending on when you choose and how you grow and evolve you may or may not be at the right place at the right time.

That's my shortest comment to a very long topic :)

sinofsky | 11 years ago | on: Object Oriented Programming is Inherently Harmful

These are some great quotes. Caveat: I am old. Programming methodology debates wear me out as quickly as language or commenting convention debates :)

I certainly remember many of the quotes when they first came about. I’m a product of the object-oriented wave. At the time I felt I had all the arguments as to why things were so much better with (or so much worse without) OO. In many cases, looking back I realized I was basically arguing for better tools and/or slightly better adherence to a few conventions.

My lessons from going through each “language” transition from debates over “assembler is the only way” through today’s DSLs and more:

1. OO wasn’t good or bad intrinsically. The principles, however, can easily lead to more manageable or coherent code over time. By and large, inheritance, polymorphism, encapsulation, and abstraction form the foundation of large-scale systems.

2. Languages, not concepts, can do a great deal of harm. Far too often, engineers dive into a new language or paradigm and assume all the code needs to exhibit all the properties of the new religion. In the early days of C++ programming, the saying we had on our projects was “a framework is not a compiler test suite”. I think in all languages, especially today, the risk is that you do more harm than benefit when you try to do everything in some fancy new way. Maybe there is a role for operator overloading or templates in C++, but I never really found it. Yet I am pretty certain nearly every framework employed these techniques. You can’t blame the language, because every language has stuff you can abuse. You can blame the zealots and evangelists, who often cause the most challenges.

3. Language and paradigm innovation benefits rarely scale in very large systems, but small systems early on exhibit amazing benefits. Most engineers are seduced by new languages and paradigms (OO was just the one that came after structured programming and before functional and others). In the beginning, the new language or approach is amazing. Always amazing. Over time the real world shows up, and every new engineer who joins a project feels that the code is bloated and needs a rewrite. Efficiency declines. The magic fades as reality dominates. With more than a few engineers, the complexity of interconnection between parts of the code base trumps the simplicity and elegance within any one part of the architecture. Expressing those interconnections in paradigm- or language-elegant ways reaches a very high degree of difficulty over time. At one extreme we see competitions where “hello world” or the most basic app is amazingly simple. At the other extreme we see a constant breakdown in even the most basic methodological approaches. Even “Goto” was hard to do without, and certainly in OO maintaining a pure inheritance model, public/private data, and more becomes close to impossible.

4. In algorithmic-complexity terms, a language or paradigm is at best a constant-factor improvement over any other choice. The age-old rule of thumb is that programmer productivity is language independent. While I have no doubt that one could not spin up a new social network in assembler, one would be equally hard pressed to write a device driver or graphics runtime in Ruby or Python. Part of why methodologies gain attention is that the runtimes/libraries/frameworks that come with them do the things that need to be done the way people want them to work today--that’s what gives the appearance of improved productivity. The right libraries in C can serve a great purpose. We see this when a language gets a new library that brings renewed interest to it.

5. Tools are everything. What makes or breaks a paradigm/language is tools. You can take a simple language or paradigm with great tools and be much more productive than with a “better” paradigm and poor or ill-suited tools. One way this surfaced over time was with tools that generated the right code--interface builders, for example. Suddenly complex, archaic, or intensely manual approaches lacking formal foundations became much easier. The transparency of the generated code was a bonus, because other tools could easily integrate with it (having access to a whole tool system is also more productive than any one methodology+tool).

Ultimately, I think OO is perfectly good, and almost all modern systems make use of the 4 basic pillars of the paradigm. I don’t think it ever became the answer to code reuse or code quality that proponents claimed. In the end the methodology is going to be trumped by the scale and age of the code and system. Any success means your ability to start over is reduced, so the best bet is to know the principles your project is being created and run with.

sinofsky | 11 years ago | on: Mobile is Eating the World

It is me. I've always participated in forums like HN while at Microsoft (and now). I think you can even find me on USENET archives :-)

sinofsky | 11 years ago | on: Mobile is Eating the World

History shows that in the short term the devices that are new have characteristics of the generation being replaced. But then they break away and things get redefined. The first word processors looked like typewriters. The first presentation packages focused on 35mm slides and "foils".

sinofsky | 11 years ago | on: Mobile is Eating the World

At every moment of disruption in technology, people say that the new technology doesn't replace the incumbent. By definition, disruptive technologies are less functional and are inadequate "replacements".

First, folks tend to talk about all the things that the new technology can't do that the old one does. In the Steve Jobs interview at All Things D referenced in the comments, he talks about how software needs to get written--"it is just software," he says. In recent history we have seen this same dynamic in the advent of the GUI relative to the CUI, or in the way the browser/HTML subsumed GUI client-server apps. People are writing more code all the time that is "mobile only," even if some of it reinvents or reimagines the desktop/laptop world. I was struck by Adobe's recent developer conference, where they showed many mobile apps. As an always-aspiring photog, I can see how the field is transitioning.

Second, people tend to underestimate the way that new tools, as ineffective as they are, drive changes in the very definition of work. Said another way, people forget that tools can also define the work and jobs people have. It isn't like work was always "mail around a 10MB presentation before the meeting". In fact, a long time ago meeting agendas were typed out in Courier by a typist--that job was defined by the Selectric. The tools that created presentations, attachments, and follow-up email defined a style of working. While we're reading all this, the exponential rise of mobile is changing what it means to work--to go to a meeting, to collaborate, to decide, to create, etc.

What is so fascinating about this transition is that we might be seeing a divide where the creators of tools will use different tools, at least for some time, than the masses who use tools. Let's not project the needs of developers onto the whole space. We might reach a point where different tools are needed. Two years ago I might have said this applies to a lot of fields, but the rapid rise of mobile and tablet-based software for many things is making that argument weak. Cash registers, MRI machines, video annotation, and more are all scenarios I have seen recently where one might have said "needs a real OS" or "this needs a full PC". As with the idea of underestimating software, our own desire to find an anchor pushes us to view things through a lens where our own work doesn't change.

All of this is happening. In parts of the world (Africa, China) people are skipping over PCs. Everyone is seeing their time in front of a screen go up enormously, and most of that is additive, but for many there is a substitution effect. This doesn't happen overnight or for everyone. To deny it, though, is to deny the very changes by which the mouse, overlapping windows, and color once displaced technologies that people said they could never substitute for in speed, efficiency, or capability.
