
Curl 7.51.0 Released

181 points | emillon | 9 years ago | curl.haxx.se

32 comments


neic|9 years ago

I see that Ubuntu 16.04 LTS has version 7.47.0 [1]. It's been 9 months, 9 releases, and at least 15 CVEs since then. I can also see that some of the CVEs were reported to distros@openwall [2]. I (naively) assumed that once a vulnerability was reported there, the package maintainers would update the packages and push a release at the same time as the original developer made a public statement. Then I could just update my system and be done with it.

Where is the fault in this chain? How can I, as a maintainer of a few servers, be sure my servers are secure without manually patching every package?

[1] http://packages.ubuntu.com/xenial/libcurl3

[2] http://oss-security.openwall.org/wiki/mailing-lists/distros

EDIT: changed "12 CVEs" to "at least 15 CVEs". The changelog doesn't have CVE numbers in the title for all of them.

hannob|9 years ago

The whole concept of LTS distributions is to stick with one version and patch only important bugfixes and security vulnerabilities.

So if the Ubuntu security team does its job properly, you shouldn't have a reason to worry.

(However, given the number of security vulns these days, it's often challenging for LTS distributions to backport all security fixes. There are already breakdowns of the LTS concept, e.g. shipping the latest upstream versions of some packages like Chromium, where backporting is not realistic.)

geofft|9 years ago

If you go to the "Ubuntu Changelog" link on the right, you can see that they've backported three security fixes (CVE-2016-5419, CVE-2016-5420, and CVE-2016-5421) since the 16.04 LTS release.

You are trusting Ubuntu's judgment that the remaining 12 CVEs aren't that important. Ubuntu's security team is pretty good, but I don't think there is any distro that is extremely good. In part this is because a distro is on the hook for compatibility of all the software they ship, and expected to prioritize compatibility over security. Anything other than targeted security fixes can cause regressions.

fuhrysteve|9 years ago

Does anyone have an abbreviated explanation of the security vulnerabilities that were addressed here? I recall there was a very ominous post saying to look out for this release because of some nasty stuff they found.

jamies888888|9 years ago

I love cURL. Keep up the good work.

du_bing|9 years ago

What is the biggest use of curl? I am new to Linux, sorry.

hannob|9 years ago

The biggest usage is probably not the tool itself, but the library.

It's the de facto standard HTTP library (although it does more than just HTTP). E.g. if you have a PHP script that downloads some data from another web page, it very often does so with curl.

thejosh|9 years ago

Apart from the oodles of software that depends on curl/libcurl, my favourite thing is to "Copy as cURL" from a Chrome request.

SturgeonsLaw|9 years ago

It's handy for giving an API endpoint a slap without needing a scripting language
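A minimal sketch of such a slap; the endpoint and port are made up, with a throwaway local Python server standing in for the real API:

```shell
# Hit an endpoint and grab the status code -- no scripting
# language needed. Port 8126 and the server are illustrative.
python3 -m http.server 8126 --bind 127.0.0.1 >/dev/null 2>&1 &
SRV=$!
sleep 1
CODE=$(curl -s -o /dev/null -w '%{http_code}' http://127.0.0.1:8126/)
kill "$SRV"
echo "$CODE"   # 200 if the server answered
```

For a POST, the same one-liner just grows -X POST, -H 'Content-Type: application/json', and -d '{"...": "..."}'.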

72deluxe|9 years ago

The underlying library (libcurl) is extremely useful on all platforms, not just Linux.

0xmohit|9 years ago

I find it very useful for debugging. For example, when you have a web server that redirects requests, either

  curl --verbose --location URL

or even

  curl --trace - --location URL

provides the desired information. (--trace takes a file argument; "-" sends the trace to stdout.)

AstroJetson|9 years ago

Data transfer by many protocols using URLs. See their website https://curl.haxx.se for details.

Super tool, we use it all the time for file transfers

gnur|9 years ago

I use it for a few things, almost every day.

Getting my outgoing ip address (to check net connectivity): curl ip.mydomain.net or (to force ipv4) curl -4 ip.mydomain.net

To check how a page redirects: curl -v example.com (shows headers)

Interacting with elasticsearch, mainly showing indexes and their health: curl localhost:9200/_cat/indices

dorfsmay|9 years ago

It's used to send HTTP/HTTPS requests while controlling every aspect of them, from headers to cookies, to ignoring (or not) SSL certs. It also has advanced debugging options to show you the entire dialog (-sv) and to resolve a hostname to a different IP address (--resolve, to test your firewalls, LBs, etc.).
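The --resolve trick can be sketched like this; the hostname app.internal is made up, and a local Python server stands in for the one backend being tested:

```shell
# --resolve pins host:port to a specific IP, bypassing DNS,
# so a request can target one particular server behind a load
# balancer. Everything here runs against localhost.
python3 -m http.server 8124 --bind 127.0.0.1 >/dev/null 2>&1 &
SRV=$!
sleep 1
CODE=$(curl -s --resolve app.internal:8124:127.0.0.1 \
            -o /dev/null -w '%{http_code}' \
            http://app.internal:8124/)
kill "$SRV"
echo "$CODE"   # 200: the made-up name reached our local server
```

The same flag works for HTTPS, which is the usual case when checking a cert or an LB member directly.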

erelde|9 years ago

File transfers, getting HTTP headers, getting the HTML content of a page and piping it to a script to clean it up and read later (long articles).

That's mainly what I use it for, I think.

Edit: also between php scripts
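A rough sketch of that fetch-and-clean pipeline, using a throwaway local page (a real cleanup would use a proper HTML parser rather than sed):

```shell
# Serve a tiny page locally, fetch it with curl, and strip the
# tags with sed -- crude, but enough for plain-text reading.
DEMO_DIR=$(mktemp -d)
printf '<html><body><p>Hello, curl!</p></body></html>' > "$DEMO_DIR/index.html"
python3 -m http.server 8125 --bind 127.0.0.1 --directory "$DEMO_DIR" >/dev/null 2>&1 &
SRV=$!
sleep 1
TEXT=$(curl -s http://127.0.0.1:8125/index.html | sed 's/<[^>]*>//g')
kill "$SRV"
echo "$TEXT"   # Hello, curl!
```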

kinow|9 years ago

Change log for this release

Fixed in 7.51.0 - November 2 2016

Changes:

    nss: additional cipher suites are now accepted by CURLOPT_SSL_CIPHER_LIST
    New option: CURLOPT_KEEP_SENDING_ON_ERROR 
Bugfixes:

    CVE-2016-8615: cookie injection for other servers
    CVE-2016-8616: case insensitive password comparison
    CVE-2016-8617: OOB write via unchecked multiplication
    CVE-2016-8618: double-free in curl_maprintf
    CVE-2016-8619: double-free in krb5 code
    CVE-2016-8620: glob parser write/read out of bounds
    CVE-2016-8621: curl_getdate read out of bounds
    CVE-2016-8622: URL unescape heap overflow via integer truncation
    CVE-2016-8623: Use-after-free via shared cookies
    CVE-2016-8624: invalid URL parsing with '#'
    CVE-2016-8625: IDNA 2003 makes curl use wrong host
    openssl: fix per-thread memory leak using 1.0.1 or 1.0.2
    http: accept "Transfer-Encoding: chunked" for HTTP/2 as well
    LICENSE-MIXING.md: update with mbedTLS dual licensing
    examples/imap-append: Set size of data to be uploaded
    test2048: fix url
    darwinssl: disable RC4 cipher-suite support
    CURLOPT_PINNEDPUBLICKEY.3: fix the AVAILABILITY formatting
    openssl: don't call CRYPTO_cleanup_all_ex_data
    libressl: fix version output
    easy: Reset all statistical session info in curl_easy_reset
    curl_global_cleanup.3: don't unload the lib with sub threads running
    dist: add CurlSymbolHiding.cmake to the tarball
    docs: Remove that --proto is just used for initial retrieval
    configure: Fixed builds with libssh2 in a custom location
    curl.1: --trace supports % for sending to stderr!
    cookies: same domain handling changed to match browser behavior
    formpost: trying to attach a directory no longer crashes
    CURLOPT_DEBUGFUNCTION.3: fixed unused argument warning
    formpost: avoid silent snprintf() truncation
    ftp: fix Curl_ftpsendf
    mprintf: return error on too many arguments
    smb: properly check incoming packet boundaries
    GIT-INFO: remove the Mac 10.1-specific details
    resolve: add error message when resolving using SIGALRM
    cmake: add nghttp2 support
    dist: remove PDF and HTML converted docs from the releases
    configure: disable poll() in macOS builds
    vtls: only re-use session-ids using the same scheme
    pipelining: skip to-be-closed connections when pipelining
    win: fix Universal Windows Platform build
    curl: do not set CURLOPT_SSLENGINE to DEFAULT automatically
    maketgz: make it support "only" generating version info
    Curl_socket_check: add extra check to avoid integer overflow
    gopher: properly return error for poll failures
    curl: set INTERLEAVEDATA too
    polarssl: clear thread array at init
    polarssl: fix unaligned SSL session-id lock
    polarssl: reduce #ifdef madness with a macro
    curl_multi_add_handle: set timeouts in closure handles
    configure: set min version flags for builds on mac
    INSTALL: converted to markdown => INSTALL.md
    curl_multi_remove_handle: fix a double-free
    multi: fix infinite loop in curl_multi_cleanup()
    nss: fix tight loop in non-blocking TLS handshake over proxy
    mk-ca-bundle: Change URL retrieval to HTTPS-only by default
    mbedtls: stop using deprecated include file
    docs: fix req->data in multi-uv example
    configure: Fix test syntax for monotonic clock_gettime
    CURLMOPT_MAX_PIPELINE_LENGTH.3: Clarify it's not for HTTP/2

teh_klev|9 years ago

Why repeat this here in an inferior format?