top | item 12921193


greendragon|9 years ago

Curious what they weren't so happy about with Python? Was it purely performance? If so, did they consider PyPy, or at least profile what the slowest bits are so they can evaluate whether to throw everything out or just rewrite the slow bits? Was it the language itself? Not everyone likes dynamic languages, though it's odd they started with it. Did you consider Node at all?


snovv_crash|9 years ago

From my time doing server backend Python dev, the big problem is that everything is only caught at runtime: missing arguments, typos in variable names that accidentally match another variable, an int silently turning into a string. Having a compiler catch these saves a lot of time and hair-pulling. And having unit tests as a final line of defense, rather than the only defense, does wonders for my peace of mind.
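A minimal sketch of the kind of bug being described (the function and its names are hypothetical): a typo in a variable name loads without complaint and only blows up when its branch actually runs, and a stray string argument doesn't error at all, it just produces the wrong kind of value.

```python
def total_price(price, quantity, discount=0):
    subtotal = price * quantity
    if discount:
        # Typo: 'discunt' instead of 'discount'. Python happily loads this
        # function; the NameError only fires when this branch executes.
        subtotal -= subtotal * discunt
    return subtotal

print(total_price(10, 3))     # 30 -- the buggy branch never ran
print(total_price("10", 3))   # "101010" -- str * int is legal, wrong type

try:
    total_price(10, 3, discount=0.1)
except NameError as e:
    print("only caught at runtime:", e)
```

A compiler for a statically typed language would reject both the undefined name and the string-where-int-expected call before the program ever ran.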

greendragon|9 years ago

As a Python user and fan I hear this complaint a lot. I understand, but I can't really agree, since things like typos and type errors pretty much never happen to me, at least in production. My secret? I use the REPL, heavily. (And not even in the grand Lisp fashion, because Python's REPL isn't very advanced; mostly I use it off to the side, sometimes with an instance of the full program, or parts of it, running.) Using the REPL catches most of those things just as quickly as a compiler, plus it can catch things compilers don't, such as null pointer exceptions.
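The REPL workflow sketched as plain code (a hypothetical helper being poked at "off to the side"): defining a function and immediately feeding it awkward inputs surfaces the None case interactively, which is exactly the class of error a compiler's type check alone doesn't rule out when a value can legitimately be None at runtime.

```python
# Hypothetical helper under development; in practice this would be
# typed into the REPL and exercised immediately.
def first_word(text):
    return text.split()[0]

print(first_word("hello world"))   # hello

try:
    first_word(None)
except AttributeError as e:
    # The REPL shows this the moment you try the call:
    print("caught interactively:", e)
```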

Two lesser secrets: first, using a linter, which catches all sorts of issues too; second, actually getting the full program locally to a state where I can have it execute (most of) the new code I just wrote, the code I didn't verify in the REPL or that uses data sources I didn't just define temporarily there, so I can make sure it does what I intended. A lot of devs don't seem to do that second bit. Checked-in Java code compiles, passes the existing tests, and goes through a basic code review, but inevitably bugs get filed because it doesn't actually do everything the story said. It's like they never tried out their own code; it just looked correct, and the compiler/tests agreed.

I think when you're working with the REPL interactively, instead of relying on the common "edit -> save -> compile -> ship it|start over" cycle, you don't miss those details as much, because you're constantly trying out your own code. Maybe my experience differs because I don't typically use dynamic languages as scripting languages, at least in the sense of quickly hacking up a script, saving, skipping the compile step (look how much faster it is to develop in dynamic languages!), and running it until it works. I have done that, but even then I'm usually writing the bulk of the script in the REPL -- or rather in my editor, which can send text to the REPL. It's quite different from what seems to have made these languages popular in the first place: not having to explicitly type everything and getting to skip a (potentially long) compile step (which also encourages more source sharing).

fizzbatter|9 years ago

Type safety mainly, I think. Performance is a definite concern, but they have a lot of internal applications and their stability varies. I offered up that less dynamic languages would provide more speed and more reliability to boot.

I know Python got type hints in 3.5, though I'm not sure if it has Go-like interfaces (traits in Rust). If not, I think it really should.
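For what it's worth, Python did later grow exactly this: `typing.Protocol` (added in 3.8, well after this thread) gives structural, Go-interface-style conformance checkable by tools like mypy. A sketch of what that looks like (the `Reader`/`InMemory` names here are illustrative, not from any library):

```python
from typing import Protocol

class Reader(Protocol):
    """Structural interface: anything with a matching read() conforms,
    no explicit inheritance required -- the same idea as a Go interface."""
    def read(self, n: int) -> bytes: ...

class InMemory:
    def __init__(self, data: bytes) -> None:
        self.data = data
    def read(self, n: int) -> bytes:
        chunk, self.data = self.data[:n], self.data[n:]
        return chunk

def head(r: Reader, n: int = 4) -> bytes:
    # A static checker verifies InMemory satisfies Reader structurally;
    # at runtime this is still ordinary duck typing.
    return r.read(n)

print(head(InMemory(b"hello world")))  # b'hell'
```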

I do firmly believe they'll be quite happy with Go though. Rust, not so much.

greendragon|9 years ago

Seems like a lot of the Go fans I read are former Python users burned by dynamic typing, so I agree they'll end up happy with Go (or at least happier than with Rust). Though one more option you might want to consider is Nim: http://nim-lang.org/ (It's pretty easy to get up to speed in, especially for a Python user, so long as they're not expecting to use fancy OO features.)

jerf|9 years ago

Python has always had Go-like interfaces in practice. The problem was that they were not reified into the code, so you had no easy way to know when calling a function and passing it a "file" exactly what file-like things the function was going to do with that "file" without reading the source code. You had to extract the interface yourself.
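The un-reified interface problem in miniature (a hypothetical library function, not real code from any project): the signature just says it takes a "file", and only reading the body reveals which file-like methods it actually depends on.

```python
import io

# Hypothetical function from some library. Nothing in the signature says
# what 'f' must support; the implicit interface has to be extracted by
# reading the body: it needs .seek() and .read().
def checksum(f):
    f.seek(0)
    total = 0
    while chunk := f.read(4096):
        total = (total + sum(chunk)) % 65536
    return total

# Any object honoring that extracted interface works:
print(checksum(io.BytesIO(b"abc")))  # 294
```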

lelandbatey|9 years ago

For me, it's the extremely straightforward conventions of Go, with its straightforward tooling and its strong typing.

I "grew up" on Python, wrote a lot of code in it, and love it. But it doesn't feel as cohesive as Go.

As an example of cohesive tool design, let's look at Go package management. In Go, if I want to install a package, I install it with:

    $ go get github.com/pkg/term
Having installed this package, I import it in my code with:

    import "github.com/pkg/term"
Having imported this package, I'd like to read the documentation for it. To do that I use the command `go doc` with the package name:

    $ go doc github.com/pkg/term
Now that I've read the docs, I've got a question about how some particular functionality is implemented. With Go, I happen to know exactly where I can read that code, on my own hard drive:

    $ cd $GOPATH/src/github.com/pkg/term
With Python, I find that I don't have this absolute guarantee of consistency. Usually packages follow a similar convention, but some require installing under one name and importing under another, and the local documentation viewer (pydoc) isn't installed by default, so I didn't even know about it until relatively late in my use of Python. I've had a similar experience with the rest of Python's tooling: it's as feature-complete as Go's, or better, but it's not quite as consistent.
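The install-name/import-name mismatch is real and unenforced; a few well-known examples, laid out as data:

```python
# The PyPI (pip install) name and the import name are separate, per-package
# conventions -- nothing ties them together the way Go's import paths do.
install_to_import = {
    "beautifulsoup4": "bs4",      # pip install beautifulsoup4; import bs4
    "Pillow":         "PIL",      # pip install Pillow;         import PIL
    "scikit-learn":   "sklearn",  # pip install scikit-learn;   import sklearn
}

for pkg, mod in install_to_import.items():
    print(f"pip install {pkg:15} -> import {mod}")
```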

greendragon|9 years ago

It was pretty bad that easy_install shipped with no easy_uninstall. Plus some packages are in your system's package manager (which I think is great, because I'm sick of every language having its own package manager when my system's (Gentoo) is better) and some aren't, or the latest versions aren't. Plus there's the virtualenv stuff, or the general problem of your dev environment not matching the deploy environment. Needing both Python 2 and Python 3 on your system in some cases. Some packages have C/C++ code, so you need a compiler and all the dependencies that implies. On Windows, I think Python development is a joke; the last time I did anything extensive there, I ended up installing Enthought's distribution and picking packages off http://www.lfd.uci.edu/~gohlke/pythonlibs/ as needed. I don't see how the Go situation on Windows could be worse than that.

I'm not a huge stickler for non-local consistency -- one of the things I like about Nim is its apathy about naming conventions (foobar is the same symbol as foo_bar or fooBar, and func(arg) is the same as arg.func()...) -- so that's probably why I don't find the consistency factor a huge issue. When a language and its ecosystem have it, it's nice, but when they don't, it's not something that really annoys me.

RBerenguel|9 years ago

Can't speak for the OP, but once you go full type checking it's hard to go back. Our infrastructure has many pieces in Python (right now I'm rewriting some), but all new APIs are in Go. The amount of trouble you never even have to fight, thanks to type checking, is huge. The performance gains are also good in many cases. Slightly more verbosity is a minor price to pay.