| Package | Description |
|---|---|
| com.datasift.client.push | |
| com.datasift.client.push.connectors | |
| Modifier and Type | Field and Description |
|---|---|
| static OutputType<S3> | OutputType.S3_OUTPUT |
| Modifier and Type | Method and Description |
|---|---|
| static S3 | PushConnectors.s3()<br>See the official documentation. Data format delivered: JSON document. |
| Modifier and Type | Method and Description |
|---|---|
| S3 | S3.accessKey(String key) |
| S3 | S3.acl(String acl)<br>The access level of the file after it is uploaded to S3: private (owner-only read/write), public-read (owner read/write, public read), public-read-write (public read/write), authenticated-read (owner read/write, authenticated read), bucket-owner-read (bucket owner read), bucket-owner-full-control (bucket owner full control). |
| S3 | S3.bucket(String bucket) |
| S3 | S3.compression(String format) |
| S3 | S3.deliveryFrequency(int frequency)<br>The minimum number of seconds you want DataSift to wait before sending data again: 0 (continuous delivery), 10 (10 seconds), 30 (30 seconds), 60 (1 minute), or 300 (5 minutes). In practice, a stream might not have new data available after the wait. |
| S3 | S3.directory(String directory)<br>Optionally sets a directory within the configured bucket. |
| S3 | S3.filePrefix(String prefix)<br>An optional prefix for the filename. |
| S3 | S3.format(S3.S3OutputFormat format)<br>Sets the output format for your data. |
| S3 | S3.gzip() |
| S3 | S3.maxSize(int maxSize)<br>The maximum amount of data that DataSift will send in a single batch: 102400 (100KB), 256000 (250KB), 512000 (500KB), 1048576 (1MB), 2097152 (2MB), 5242880 (5MB), 10485760 (10MB), 20971520 (20MB), 52428800 (50MB), 104857600 (100MB), or 209715200 (200MB). |
| S3 | S3.secretKey(String secret) |
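The methods above are fluent setters: each one records a connector parameter and returns the same S3 instance, so a full destination can be configured in a single chain starting from PushConnectors.s3(). The sketch below is a self-contained, hypothetical stand-in for that builder (the class name S3Sketch and the parameter key strings are assumptions, not the library's actual internals); it only illustrates the chaining pattern and the parameter values documented in the table.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical stand-in for the S3 connector builder: each setter stores a
// parameter and returns `this`, mirroring how accessKey, bucket, acl, etc.
// chain in the real S3 class. Parameter key names here are illustrative only.
class S3Sketch {
    private final Map<String, String> params = new LinkedHashMap<>();

    S3Sketch accessKey(String key)       { params.put("access_key", key); return this; }
    S3Sketch secretKey(String secret)    { params.put("secret_key", secret); return this; }
    S3Sketch bucket(String bucket)       { params.put("bucket", bucket); return this; }
    S3Sketch directory(String dir)       { params.put("directory", dir); return this; }
    S3Sketch acl(String acl)             { params.put("acl", acl); return this; }
    S3Sketch deliveryFrequency(int secs) { params.put("delivery_frequency", String.valueOf(secs)); return this; }
    S3Sketch maxSize(int bytes)          { params.put("max_size", String.valueOf(bytes)); return this; }

    Map<String, String> params() { return params; }

    public static void main(String[] args) {
        // Values chosen from the documented options: private ACL,
        // one batch per minute, 10MB maximum batch size.
        Map<String, String> p = new S3Sketch()
                .accessKey("my-access-key")   // placeholder credential
                .secretKey("my-secret-key")   // placeholder credential
                .bucket("my-bucket")
                .directory("datasift")
                .acl("private")
                .deliveryFrequency(60)
                .maxSize(10485760)
                .params();
        System.out.println(p.get("bucket") + " " + p.get("acl") + " " + p.get("max_size"));
    }
}
```

The chained calls read in any order, since each setter is independent; only the credentials and bucket are logically required before the connector can deliver data.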
Copyright © 2015. All Rights Reserved.