Requiring cURL in Your PHP Library

I sometimes hear that people don’t want to use Guzzle (a PHP HTTP client) because it requires cURL and they want their library to be “portable”. In this post, I’ll attempt to convince you that cURL is the best option for sending HTTP requests in PHP, compare cURL against more “portable” PHP alternatives, and show that your users will almost certainly have cURL installed on their systems already.

“Portable” cURL alternatives

First off, let’s define “portable”. Most of the people who throw this word around imply that if something isn’t part of PHP’s core then it isn’t portable. OK, so what are the different ways to send HTTP requests using only what PHP’s core distribution provides?

HTTP stream wrapper

The most common alternative to requiring cURL in a PHP application is to rely on PHP’s HTTP stream wrapper. If your requirements are very limited, then this might be an OK alternative for you. There are some drawbacks to using the PHP HTTP stream wrapper that you should know about before ditching cURL for it:

  • Speaks HTTP/1.0 by default; enabling HTTP/1.1 via the protocol_version context option has historically been unreliable
  • Does not support streaming uploads. Uploads for POST, PUT, etc. must come from a string held in memory; try uploading a 2GB file that way.
  • Does not support persistent HTTP connections. Opening and closing TCP connections over and over can be a massive performance penalty to an application that makes several requests to the same server.
  • Does not support cURL’s fine-grained timeout and speed-limit options
  • Does not maintain cookies between requests. You would need to implement cookie management manually.
  • Does not support sending requests in parallel. Sending requests in parallel can provide significant performance gains to applications that need to send many requests at once.
  • It’s slower than cURL
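
For simple cases, the stream wrapper’s main appeal is that it only takes a few lines of code. A minimal sketch (the URL is a placeholder), subject to every limitation listed above:

```php
<?php
// A minimal GET via PHP's HTTP stream wrapper.
$context = stream_context_create([
    'http' => [
        'method'        => 'GET',
        'timeout'       => 5.0,   // one coarse timeout; no separate connect/speed limits like cURL
        'ignore_errors' => true,  // still return the body on 4xx/5xx responses
        'header'        => "Accept: text/html\r\n",
    ],
]);

// The entire response body is buffered into a string in memory.
$body = @file_get_contents('http://example.com/', false, $context);

// On success, PHP populates $http_response_header with the raw
// headers of the last response received.
```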


Creating a PHP HTTP client using sockets is another alternative. When I hear that someone is writing a socket-based HTTP client from scratch, my first thought is, “have you seen RFC 2616?!” There’s an awful lot of context, state transitions, and edge cases to consider when implementing a socket-based HTTP client. Because of this complexity, developers rarely get it right, or they omit whole swaths of HTTP/1.1 features because those features are hard to implement (e.g. Expect: 100-Continue, trailing headers, multipart messages, etc). The one socket-based PHP HTTP/1.1 client that actually does a decent job sadly lacks support for sending requests in parallel.
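
To illustrate, here is roughly what the “happy path” of such a client looks like (the host argument is a placeholder); note everything it silently ignores: chunked bodies, redirects, keep-alive, timeouts on stalled reads, 100-continue, and more:

```php
<?php
// A naive socket-based GET: the "easy" 10% of an HTTP client.
// The edge cases listed below are all left unhandled.
function naive_http_get($host, $path = '/')
{
    $fp = @fsockopen($host, 80, $errno, $errstr, 5.0);
    if ($fp === false) {
        return null; // connect/DNS failures get no retries or detail
    }

    fwrite($fp, "GET $path HTTP/1.1\r\nHost: $host\r\nConnection: close\r\n\r\n");

    $response = '';
    while (!feof($fp)) {        // can block indefinitely on a stalled server
        $response .= fread($fp, 8192);
    }
    fclose($fp);

    // Naive header/body split; wrong for chunked encoding, among others.
    $parts = explode("\r\n\r\n", $response, 2);
    return isset($parts[1]) ? $parts[1] : '';
}
```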

There are a great many edge cases to consider when implementing a socket-based HTTP client:

  • What if the remote server is not listening on the specified port?
  • What if the remote server takes too long to respond?
  • What if the HTTP response message is not a valid HTTP response?
  • What if the HTTP response states that it will send more data than it actually sends?
  • What if the connection is severed in the middle of the request?
  • What if a request with an Expect: 100-Continue header never receives a 100 Continue response?
  • What if you need to implement persistent connections? Did the remote server send back a Connection: close header? Did the connection close unannounced?
  • What if the remote server responds with chunked Transfer-Encoding?
  • Will you implement 300 level redirects? Will you gracefully handle Location headers that use relative URLs?
  • What if you need to maintain a cookie session between requests?
  • What if you need to use Digest authentication?
  • Will you support sending requests in parallel?
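
To pick just one item from that list, here is a sketch of decoding a chunked Transfer-Encoding body by hand. Even this deliberately ignores chunk extensions and trailing headers, which a real client must also handle:

```php
<?php
// Decode a chunked Transfer-Encoding body. Each chunk is a
// hex size line followed by that many bytes and a CRLF; a
// zero-length chunk terminates the body.
function decode_chunked($raw)
{
    $decoded = '';
    $offset  = 0;
    while (true) {
        $lineEnd = strpos($raw, "\r\n", $offset);
        if ($lineEnd === false) {
            break; // malformed: no chunk-size line
        }
        $size = hexdec(trim(substr($raw, $offset, $lineEnd - $offset)));
        if ($size === 0) {
            break; // terminal zero-length chunk
        }
        $decoded .= substr($raw, $lineEnd + 2, $size);
        $offset = $lineEnd + 2 + $size + 2; // skip chunk data plus its CRLF
    }
    return $decoded;
}

// decode_chunked("4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n") returns "Wikipedia"
```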

Need more examples that prove HTTP/1.1 is complicated? Check this out.

Trying to implement a socket-based HTTP client in PHP with a feature set comparable to cURL’s is a monumental undertaking. If you choose to go down this path, then Godspeed.
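
For contrast, cURL’s multi interface gives you parallel transfers essentially for free. A sketch, with placeholder URLs:

```php
<?php
// Parallel GETs with curl_multi: this is what the stream wrapper
// and most socket-based clients cannot do.
$urls = ['http://example.com/a', 'http://example.com/b'];

$multi   = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_multi_add_handle($multi, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers concurrently until every handle is done.
do {
    $status = curl_multi_exec($multi, $running);
    if ($running) {
        curl_multi_select($multi); // wait for activity on any handle
    }
} while ($running && $status == CURLM_OK);

$bodies = [];
foreach ($handles as $url => $ch) {
    $bodies[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);
```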

How ubiquitous is cURL?

Daniel Stenberg, the author of libcurl, recently wrote about this very topic. Daniel estimates that there are probably around 550,000,000 direct or indirect libcurl users. His first estimate was around 300 million, but he revised it upward after realizing that libcurl ships on every iOS device. That is just an estimate of unique cURL users overall, so let’s narrow things down with more PHP-specific data.

Shared hosts that provide cURL by default

Creating a library that requires cURL assumes that your users either already have cURL installed or can install PHP’s cURL extension. The scariest part of that requirement is that some of your users will be on a shared server without shell or root access.

With that in mind, I quickly compiled a list of shared hosting companies that include PHP’s cURL extension by default on all of their servers.

Let me know in the comments if you can think of other shared hosting providers that provide PHP’s cURL extension by default.

Who uses cURL?

PHP’s cURL extension is utilized by countless PHP libraries. Every one of these library authors decided that depending on cURL was acceptable.

API clients that require cURL

Lots of large companies offer PHP SDKs for their web services that require cURL. Any developer who uses any of the following libraries has cURL installed on their system:

Who else uses cURL?

Some really popular frameworks and libraries require cURL. Developers who use any of the following will likely have cURL installed on their system:

Installing php-curl

Installing cURL is usually really, really easy.

  • Ubuntu: apt-get install php5-curl
  • Fedora / Amazon Linux: yum install php-curl
  • WAMP

    1. Left-click on the WAMP server icon in the bottom right of the screen
    2. PHP -> PHP Extensions -> php_curl
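
However your users install it, a library can fail fast with an actionable message when the extension is missing instead of dying on an undefined curl_init() call deep inside the code. A minimal guard (the function name is just an illustration):

```php
<?php
// Throw a clear, actionable error if PHP's cURL extension is missing.
function require_curl()
{
    if (!extension_loaded('curl')) {
        throw new RuntimeException(
            'This library requires the PHP cURL extension (see the install steps above).'
        );
    }
}
```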


cURL is everywhere. Extremely popular PHP libraries already require cURL. Most shared PHP hosts enable cURL by default. Requiring cURL in your PHP library will not deter a statistically significant number of users, and certainly not enough to justify falling back to the underpowered PHP HTTP stream wrapper or the frivolous wheel-reinvention of building a socket-based HTTP client.