The aws package aims to provide Haskell programmers with support for using Amazon Web Services such as S3 (storage), SQS (queuing) and others. The ultimate goal is to support all Amazon Web Services.
(If you are viewing this README online, please note that you can find the README for the latest stable version at https://github.com/aristidb/aws/blob/stable/README.org).
Make sure you have a recent GHC installed, as well as cabal-install, and installation should be as easy as:
```
$ cabal install aws
```
If you prefer to install from source yourself, you should first clone the aws repository, and install it from inside the source directory:

```
$ git clone https://github.com/aristidb/aws.git
$ cd aws
$ cabal install
```
The aws package is organised into the general Aws module namespace, and subnamespaces like Aws.S3 for each Amazon Web Service. Under each service namespace, in turn, there are general support modules and an Aws.<Service>.Commands.<Command> module for each command. For easier usage, there are the “bundling” modules Aws (general support) and Aws.<Service>.
The primary concept in aws is the Transaction, which corresponds to a single HTTP request to Amazon Web Services. A transaction consists of a request and a response, which are associated together via the Transaction typeclass. Requests and responses are simple Haskell records, but for some requests there are convenience functions to fill in default values for many parameters.
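To make this concrete, here is a minimal sketch using the S3 types that also appear in the example below. The request type S3.GetObject, the response type S3.GetObjectResponse and the convenience function S3.getObject are as named in this package; the field name mentioned in the final comment is an assumption and should be checked against Aws.S3.Commands.GetObject.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Aws.S3 as S3

-- S3.getObject is one of the convenience functions: it builds an S3.GetObject
-- request record with default values filled in for the optional parameters.
-- The Transaction typeclass pairs S3.GetObject with S3.GetObjectResponse, so a
-- runner such as Aws.pureAws can work out the response type from the request.
getReq :: S3.GetObject
getReq = S3.getObject "haskell-aws" "cloud-remote.pdf"

-- Optional parameters can then be overridden with record update syntax, e.g.
-- (field name assumed, check the GetObject command module):
-- getReq { S3.goResponseContentType = Just "application/pdf" }
```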
To be able to access AWS resources, you should put your credentials into a configuration file. (You don’t have to store them in a file, but that’s how we do it in this example.) Save the following in $HOME/.aws-keys:

```
default AccessKeyID SecretKey
```
You do have to replace AccessKeyID and SecretKey with the Access Key ID and the Secret Key respectively, of course.
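For illustration only, a filled-in line would look like the following; these are the well-known placeholder keys from Amazon's own documentation, not real credentials:

```
default AKIAIOSFODNN7EXAMPLE wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```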
Then, copy this example into a Haskell file, and run it with runghc (after installing aws):
```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Aws
import qualified Aws.S3 as S3
import Data.Conduit (($$+-))
import Data.Conduit.Binary (sinkFile)
import Network.HTTP.Conduit (withManager, responseBody)

main :: IO ()
main = do
  {- Set up AWS credentials and the default configuration. -}
  cfg <- Aws.baseConfiguration
  let s3cfg = Aws.defServiceConfig :: S3.S3Configuration Aws.NormalQuery

  {- Set up a ResourceT region with an available HTTP manager. -}
  withManager $ \mgr -> do
    {- Create a request object with S3.getObject and run the request with pureAws. -}
    S3.GetObjectResponse { S3.gorResponse = rsp } <-
      Aws.pureAws cfg s3cfg mgr $
        S3.getObject "haskell-aws" "cloud-remote.pdf"

    {- Save the response to a file. -}
    responseBody rsp $$+- sinkFile "cloud-remote.pdf"
```
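Assuming you saved the example as GetObject.hs (the file name here is only for illustration), running it looks like:

```
$ runghc GetObject.hs
```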
You can also find this example in the source distribution in the Examples/ folder.
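Uploads follow the same pattern. The following is a sketch rather than a tested example: it assumes that S3.putObject takes the bucket, the object name and an http-conduit RequestBody in that order (see the 0.6.0 release notes below for the parameter order), and that the response can simply be ignored; the bucket and object names are placeholders.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Aws
import qualified Aws.S3 as S3
import qualified Data.ByteString.Lazy.Char8 as L
import Network.HTTP.Conduit (RequestBody (RequestBodyLBS), withManager)

main :: IO ()
main = do
  {- Same configuration set-up as in the GetObject example above. -}
  cfg <- Aws.baseConfiguration
  let s3cfg = Aws.defServiceConfig :: S3.S3Configuration Aws.NormalQuery

  withManager $ \mgr -> do
    {- Upload a small in-memory body; the response is ignored here. -}
    _rsp <- Aws.pureAws cfg s3cfg mgr $
      S3.putObject "haskell-aws" "hello.txt" (RequestBodyLBS (L.pack "Hello from aws!"))
    return ()
```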
- I get an error when I try to access my bucket with upper-case characters / a very long name.
  Those names are not compliant with DNS. You need to use path-style requests, by setting s3RequestStyle in the configuration to PathStyle (see the sketch below). Note that such bucket names are only allowed in the US standard region, so your endpoint needs to be US standard.
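For illustration, a minimal sketch of switching the S3 configuration to path-style requests. It assumes that the s3RequestStyle field and the PathStyle constructor named in the answer above are exported from Aws.S3; the resulting value can be used in place of s3cfg in the GetObject example.

```haskell
import qualified Aws
import qualified Aws.S3 as S3

-- An S3 configuration that uses path-style requests, e.g. for bucket names
-- that are not DNS-compliant (upper-case characters, very long names).
pathStyleS3Config :: S3.S3Configuration Aws.NormalQuery
pathStyleS3Config = Aws.defServiceConfig { S3.s3RequestStyle = S3.PathStyle }
```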
- 0.7.5
  - Support for http-conduit 1.7 and 1.8
- 0.7.1-0.7.4
  - Support for GHC 7.6
  - Wider constraints to support newer versions of various dependencies
  - Update maintainer e-mail address and project categories in cabal file
- 0.7.0
  - Change ServiceConfiguration concept so as to indicate in the type whether this is for URI-only requests (i.e. awsUri)
  - EXPERIMENTAL: Direct support for iterated transactions, i.e. those where multiple HTTP requests may be necessary due to, e.g., response size limits.
  - Put aws functions in ResourceT to be able to safely return Sources and streams.
  - simpleAws* does not require ResourceT and converts streams into memory values (like ByteStrings) first.
  - Log response metadata (level Info), and do not let all aws runners return it.
  - S3:
    - GetObject: No longer require a response consumer in the request; return the HTTP response (with the body as a stream) instead.
    - Add CopyObject (PUT Object Copy) request type.
  - Add Examples cabal flag for building code examples.
  - Many more, small improvements.
- 0.6.2
  - Properly parse Last-Modified header in accordance with RFC 2616.
- 0.6.1
  - Fix for MD5 encoding issue in S3 PutObject requests.
- 0.6.0
  - API Cleanup
    - General: Use Crypto.Hash.MD5.MD5 when a Content-MD5 hash is required, instead of ByteString.
    - S3: Made parameter order to S3.putObject consistent with S3.getObject.
  - Updated dependencies:
    - conduit 0.5 (as well as http-conduit 1.5 and xml-conduit 1.0).
    - http-types 0.7.
  - Minor changes.
  - Internal changes (notable for people who want to add more commands):
    - http-types’ new ‘QueryLike’ interface allows creating query lists more conveniently.
- 0.5.0
  - New configuration system: configuration split into general and service-specific parts.
  - Significantly improved API reference documentation.
  - Re-organised modules to make the library easier to understand.
  - Smaller improvements.
- 0.4.1
  - Documentation improvements.
- 0.4.0.1
  - Change dependency bounds to allow the transformers 0.3 package.
- 0.4.0
  - Update conduit to 0.4.0, which is incompatible with earlier versions.
- 0.3.2
  - Add awsRef / simpleAwsRef request variants for those who prefer an IORef over a Data.Attempt.Attempt value. Also improve README and add simple example.
- aws on Github
- aws on Hackage (includes reference documentation)
- Official Amazon Web Services website
Name | Github | E-Mail | Company | Components |
---|---|---|---|---|
Aristid Breitkreuz | aristidb | [email protected] | - | Maintainer |
Bas van Dijk | basvandijk | [email protected] | Erudify AG | S3 |
David Vollbracht | qxjit | | | |
Felipe Lessa | meteficha | [email protected] | currently secret | Core, S3, SES |
Nathan Howell | NathanHowell | [email protected] | Alpha Heavy Industries | S3 |
Ozgun Ataman | ozataman | [email protected] | Soostone Inc | Core, S3 |
Steve Severance | sseveran | [email protected] | Alpha Heavy Industries | S3, SQS |
John Wiegley | jwiegley | [email protected] | FP Complete | S3 |