
DataPipeline

Service object for interacting with the AWS DataPipeline service.

public struct DataPipeline: AWSService 

AWS Data Pipeline configures and manages a data-driven workflow called a pipeline. AWS Data Pipeline handles the details of scheduling and ensuring that data dependencies are met so that your application can focus on processing the data.

AWS Data Pipeline provides a JAR implementation of a task runner called AWS Data Pipeline Task Runner. AWS Data Pipeline Task Runner provides logic for common data management scenarios, such as performing database queries and running data analysis using Amazon Elastic MapReduce (Amazon EMR). You can use AWS Data Pipeline Task Runner as your task runner, or you can write your own task runner to provide custom data management.

AWS Data Pipeline implements two main sets of functionality. Use the first set to create a pipeline and define data sources, schedules, dependencies, and the transforms to be performed on the data. Use the second set in your task runner application to receive the next task ready for processing. The logic for performing the task, such as querying the data, running data analysis, or converting the data from one format to another, is contained within the task runner. The task runner performs the task assigned to it by the web service, reporting progress to the web service as it does so. When the task is done, the task runner reports the final success or failure of the task to the web service.

Inheritance

AWSService

Initializers

init(client:region:partition:endpoint:timeout:byteBufferAllocator:options:)

Initialize the DataPipeline client

public init(
        client: AWSClient,
        region: SotoCore.Region? = nil,
        partition: AWSPartition = .aws,
        endpoint: String? = nil,
        timeout: TimeAmount? = nil,
        byteBufferAllocator: ByteBufferAllocator = ByteBufferAllocator(),
        options: AWSServiceConfig.Options = []
    ) 

Parameters

  • client: AWSClient used to process requests
  • region: Region of the server you want to communicate with. This will override the partition parameter.
  • partition: AWS partition where service resides, standard (.aws), china (.awscn), government (.awsusgov).
  • endpoint: Custom endpoint URL to use instead of standard AWS servers
  • timeout: Timeout value for HTTP requests
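
A minimal usage sketch (not from the Soto documentation), assuming a Soto 5-style `AWSClient` created with `httpClientProvider: .createNew`; later snippets on this page reuse the `client` and `dataPipeline` values defined here:

import SotoDataPipeline

// Create a shared AWSClient, then a DataPipeline service object for a specific region.
let client = AWSClient(httpClientProvider: .createNew)
let dataPipeline = DataPipeline(client: client, region: .useast1)

// ... use dataPipeline ...

// Shut the client down once you are finished with it.
try client.syncShutdown()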

init(from:patch:)

Initializer required by AWSService.with(middlewares:timeout:byteBufferAllocator:options). You cannot use this initializer directly as there are no public initializers for AWSServiceConfig.Patch. Please use AWSService.with(middlewares:timeout:byteBufferAllocator:options) instead.

public init(from: DataPipeline, patch: AWSServiceConfig.Patch) 

Properties

client

Client used for communication with AWS

public let client: AWSClient

config

Service configuration

public let config: AWSServiceConfig

Methods

activatePipeline(_:logger:on:)

public func activatePipeline(_ input: ActivatePipelineInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<ActivatePipelineOutput>

Validates the specified pipeline and starts processing pipeline tasks. If the pipeline does not pass validation, activation fails.

If you need to pause the pipeline to investigate an issue with a component, such as a data source or script, call DeactivatePipeline.

To activate a finished pipeline, modify the end date for the pipeline and then activate it.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.ActivatePipeline Content-Length: 39 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE"}

Example response:

HTTP/1.1 200 x-amzn-RequestId: ee19d5bf-074e-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 2 Date: Mon, 12 Nov 2012 17:50:53 GMT {}
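
A hedged Swift sketch of calling this operation, assuming ActivatePipelineInput exposes a pipelineId member matching the JSON request above:

// Activate the pipeline and block until the call completes (command-line style);
// in a server context, chain onto the returned EventLoopFuture instead of waiting.
_ = try dataPipeline.activatePipeline(
    .init(pipelineId: "df-06372391ZG65EXAMPLE")
).wait()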

addTags(_:logger:on:)

public func addTags(_ input: AddTagsInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<AddTagsOutput> 

Adds or modifies tags for the specified pipeline.
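
A hedged sketch, assuming AddTagsInput takes a pipelineId plus an array of Tag values with key/value members:

// Tag the pipeline so it can be found and cost-allocated later.
_ = try dataPipeline.addTags(
    .init(pipelineId: "df-06372391ZG65EXAMPLE",
          tags: [.init(key: "environment", value: "production")])
).wait()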

createPipeline(_:logger:on:)

public func createPipeline(_ input: CreatePipelineInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<CreatePipelineOutput>

Creates a new, empty pipeline. Use PutPipelineDefinition to populate the pipeline.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.CreatePipeline Content-Length: 91 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"name": "myPipeline", "uniqueId": "123456789", "description": "This is my first pipeline"}

Example response:

HTTP/1.1 200 x-amzn-RequestId: b16911ce-0774-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 40 Date: Mon, 12 Nov 2012 17:50:53 GMT {"pipelineId": "df-06372391ZG65EXAMPLE"}
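
A hedged sketch, assuming CreatePipelineInput mirrors the JSON members in the example request (name, uniqueId, and an optional description):

// Create an empty pipeline; the returned pipelineId is used by later calls.
let created = try dataPipeline.createPipeline(
    .init(description: "This is my first pipeline", name: "myPipeline", uniqueId: "123456789")
).wait()
print(created.pipelineId) // e.g. "df-06372391ZG65EXAMPLE"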

deactivatePipeline(_:logger:on:)

public func deactivatePipeline(_ input: DeactivatePipelineInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<DeactivatePipelineOutput> 

Deactivates the specified running pipeline. The pipeline is set to the DEACTIVATING state until the deactivation process completes.

To resume a deactivated pipeline, use ActivatePipeline. By default, the pipeline resumes from the last completed execution. Optionally, you can specify the date and time to resume the pipeline.
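
A hedged sketch, assuming DeactivatePipelineInput takes the pipeline identifier and an optional cancelActive flag:

// cancelActive: true cancels any running objects instead of waiting for them to finish.
_ = try dataPipeline.deactivatePipeline(
    .init(cancelActive: true, pipelineId: "df-06372391ZG65EXAMPLE")
).wait()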

deletePipeline(_:logger:on:)

@discardableResult public func deletePipeline(_ input: DeletePipelineInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<Void>

Deletes a pipeline, its pipeline definition, and its run history. AWS Data Pipeline attempts to cancel instances associated with the pipeline that are currently being processed by task runners.

Deleting a pipeline cannot be undone. You cannot query or restore a deleted pipeline. To temporarily pause a pipeline instead of deleting it, call SetStatus with the status set to PAUSE on individual components. Components that are paused by SetStatus can be resumed.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.DeletePipeline Content-Length: 50 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE"}

Example response:

x-amzn-RequestId: b7a88c81-0754-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 0 Date: Mon, 12 Nov 2012 17:50:53 GMT Unexpected response: 200, OK, undefined
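
A hedged usage sketch; the operation returns Void and is marked @discardableResult, so the future can be waited on directly:

// Permanently delete the pipeline, its definition, and its run history.
try dataPipeline.deletePipeline(.init(pipelineId: "df-06372391ZG65EXAMPLE")).wait()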

describeObjects(_:logger:on:)

public func describeObjects(_ input: DescribeObjectsInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<DescribeObjectsOutput>

Gets the object definitions for a set of objects associated with the pipeline. Object definitions are composed of a set of fields that define the properties of the object.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.DescribeObjects Content-Length: 98 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE", "objectIds": ["Schedule"], "evaluateExpressions": true}

Example response:

x-amzn-RequestId: 4c18ea5d-0777-11e2-8a14-21bb8a1f50ef Content-Type: application/x-amz-json-1.1 Content-Length: 1488 Date: Mon, 12 Nov 2012 17:50:53 GMT {"hasMoreResults": false, "pipelineObjects": [ {"fields": [ {"key": "startDateTime", "stringValue": "2012-12-12T00:00:00"}, {"key": "parent", "refValue": "Default"}, {"key": "@sphere", "stringValue": "COMPONENT"}, {"key": "type", "stringValue": "Schedule"}, {"key": "period", "stringValue": "1 hour"}, {"key": "endDateTime", "stringValue": "2012-12-21T18:00:00"}, {"key": "@version", "stringValue": "1"}, {"key": "@status", "stringValue": "PENDING"}, {"key": "@pipelineId", "stringValue": "df-06372391ZG65EXAMPLE"} ], "id": "Schedule", "name": "Schedule"} ] }
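
A hedged sketch, assuming the generated input and output members follow the JSON shape shown above:

let described = try dataPipeline.describeObjects(
    .init(evaluateExpressions: true, objectIds: ["Schedule"], pipelineId: "df-06372391ZG65EXAMPLE")
).wait()
// Each PipelineObject carries an id, a name, and its list of fields.
for object in described.pipelineObjects {
    print(object.id, object.name)
}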

describePipelines(_:logger:on:)

public func describePipelines(_ input: DescribePipelinesInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<DescribePipelinesOutput>

Retrieves metadata about one or more pipelines. The information retrieved includes the name of the pipeline, the pipeline identifier, its current state, and the user account that owns the pipeline. Using account credentials, you can retrieve metadata about pipelines that you or your IAM users have created. If you are using an IAM user account, you can retrieve metadata about only those pipelines for which you have read permissions.

To retrieve the full pipeline definition instead of metadata about the pipeline, call GetPipelineDefinition.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.DescribePipelines Content-Length: 70 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineIds": ["df-08785951KAKJEXAMPLE"] }

Example response:

x-amzn-RequestId: 02870eb7-0736-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 767 Date: Mon, 12 Nov 2012 17:50:53 GMT {"pipelineDescriptionList": [ {"description": "This is my first pipeline", "fields": [ {"key": "@pipelineState", "stringValue": "SCHEDULED"}, {"key": "description", "stringValue": "This is my first pipeline"}, {"key": "name", "stringValue": "myPipeline"}, {"key": "@creationTime", "stringValue": "2012-12-13T01:24:06"}, {"key": "@id", "stringValue": "df-0937003356ZJEXAMPLE"}, {"key": "@sphere", "stringValue": "PIPELINE"}, {"key": "@version", "stringValue": "1"}, {"key": "@userId", "stringValue": "924374875933"}, {"key": "@accountId", "stringValue": "924374875933"}, {"key": "uniqueId", "stringValue": "1234567890"} ], "name": "myPipeline", "pipelineId": "df-0937003356ZJEXAMPLE"} ] }
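
A hedged sketch, assuming DescribePipelinesInput takes an array of pipeline identifiers:

let metadata = try dataPipeline.describePipelines(
    .init(pipelineIds: ["df-08785951KAKJEXAMPLE"])
).wait()
for pipeline in metadata.pipelineDescriptionList {
    print(pipeline.pipelineId, pipeline.name)
}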

evaluateExpression(_:logger:on:)

public func evaluateExpression(_ input: EvaluateExpressionInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<EvaluateExpressionOutput>

Task runners call EvaluateExpression to evaluate a string in the context of the specified object. For example, a task runner can evaluate SQL queries stored in Amazon S3.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.EvaluateExpression Content-Length: 164 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-08785951KAKJEXAMPLE", "objectId": "Schedule", "expression": "Transform started at #{startDateTime} and finished at #{endDateTime}"}

Example response:

x-amzn-RequestId: 02870eb7-0736-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 103 Date: Mon, 12 Nov 2012 17:50:53 GMT {"evaluatedExpression": "Transform started at 2012-12-12T00:00:00 and finished at 2012-12-21T18:00:00"}

getPipelineDefinition(_:logger:on:)

public func getPipelineDefinition(_ input: GetPipelineDefinitionInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<GetPipelineDefinitionOutput>

Gets the definition of the specified pipeline. You can call GetPipelineDefinition to retrieve the pipeline definition that you provided using PutPipelineDefinition.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.GetPipelineDefinition Content-Length: 40 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE"}

Example response:

x-amzn-RequestId: e28309e5-0776-11e2-8a14-21bb8a1f50ef Content-Type: application/x-amz-json-1.1 Content-Length: 890 Date: Mon, 12 Nov 2012 17:50:53 GMT {"pipelineObjects": [ {"fields": [ {"key": "workerGroup", "stringValue": "workerGroup"} ], "id": "Default", "name": "Default"}, {"fields": [ {"key": "startDateTime", "stringValue": "2012-09-25T17:00:00"}, {"key": "type", "stringValue": "Schedule"}, {"key": "period", "stringValue": "1 hour"}, {"key": "endDateTime", "stringValue": "2012-09-25T18:00:00"} ], "id": "Schedule", "name": "Schedule"}, {"fields": [ {"key": "schedule", "refValue": "Schedule"}, {"key": "command", "stringValue": "echo hello"}, {"key": "parent", "refValue": "Default"}, {"key": "type", "stringValue": "ShellCommandActivity"} ], "id": "SayHello", "name": "SayHello"} ] }

listPipelines(_:logger:on:)

public func listPipelines(_ input: ListPipelinesInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<ListPipelinesOutput>

Lists the pipeline identifiers for all active pipelines that you have permission to access.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.ListPipelines Content-Length: 14 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {}

Example response:

x-amzn-RequestId: b3104dc5-0734-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 39 Date: Mon, 12 Nov 2012 17:50:53 GMT {"PipelineIdList": [ {"id": "df-08785951KAKJEXAMPLE", "name": "MyPipeline"}, {"id": "df-08662578ISYEXAMPLE", "name": "MySecondPipeline"} ] }
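
A hedged sketch, assuming the output's pipelineIdList carries optional id/name pairs as in the example response:

let listing = try dataPipeline.listPipelines(.init()).wait()
for entry in listing.pipelineIdList {
    print(entry.id ?? "-", entry.name ?? "-")
}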

pollForTask(_:logger:on:)

x-amzn-RequestId:​ 41c713d2-0775-11e2-af6f-6bc7a6be60d9 Content-Type:​ application/x-amz-json-1.1 Content-Length:​ 39 Date:​ Mon, 12 Nov 2012 17:​50:​53 GMT {"taskObject":​ {"attemptId":​ "@SayHello\_2012-12-12T00:​00:​00\_Attempt=1", "objects":​ {"@SayHello\_2012-12-12T00:​00:​00\_Attempt=1":​ {"fields":​ \[ {"key":​ "@componentParent", "refValue":​ "SayHello"}, {"key":​ "@scheduledStartTime", "stringValue":​ "2012-12-12T00:​00:​00"}, {"key":​ "parent", "refValue":​ "SayHello"}, {"key":​ "@sphere", "stringValue":​ "ATTEMPT"}, {"key":​ "workerGroup", "stringValue":​ "workerGroup"}, {"key":​ "@instanceParent", "refValue":​ "@SayHello\_2012-12-12T00:​00:​00"}, {"key":​ "type", "stringValue":​ "ShellCommandActivity"}, {"key":​ "@status", "stringValue":​ "WAITING\_FOR\_RUNNER"}, {"key":​ "@version", "stringValue":​ "1"}, {"key":​ "schedule", "refValue":​ "Schedule"}, {"key":​ "@actualStartTime", "stringValue":​ "2012-12-13T01:​40:​50"}, {"key":​ "command", "stringValue":​ "echo hello"}, {"key":​ "@scheduledEndTime", "stringValue":​ "2012-12-12T01:​00:​00"}, {"key":​ "@activeInstances", "refValue":​ "@SayHello\_2012-12-12T00:​00:​00"}, {"key":​ "@pipelineId", "stringValue":​ "df-0937003356ZJEXAMPLE"} \], "id":​ "@SayHello\_2012-12-12T00:​00:​00\_Attempt=1", "name":​ "@SayHello\_2012-12-12T00:​00:​00\_Attempt=1"} }, "pipelineId":​ "df-0937003356ZJEXAMPLE", "taskId":​ "2xaM4wRs5zOsIH+g9U3oVHfAgAlbSqU6XduncB0HhZ3xMnmvfePZPn4dIbYXHyWyRK+cU15MqDHwdrvftx/4wv+sNS4w34vJfv7QA9aOoOazW28l1GYSb2ZRR0N0paiQp+d1MhSKo10hOTWOsVK5S5Lnx9Qm6omFgXHyIvZRIvTlrQMpr1xuUrflyGOfbFOGpOLpvPE172MYdqpZKnbSS4TcuqgQKSWV2833fEubI57DPOP7ghWa2TcYeSIv4pdLYG53fTuwfbnbdc98g2LNUQzSVhSnt7BoqyNwht2aQ6b/UHg9A80+KVpuXuqmz3m1MXwHFgxjdmuesXNOrrlGpeLCcRWD+aGo0RN1NqhQRzNAig8V4GlaPTQzMsRCljKqvrIyAoP3Tt2XEGsHkkQo12rEX8Z90957XX2qKRwhruwYzqGkSLWjINoLdAxUJdpRXRc5DJTrBd3D5mdzn7kY1l7NEh4kFHJDt3Cx4Z3Mk8MYCACyCk/CEyy9DwuPi66cLz0NBcgbCM5LKjTBOwo1m+am+pvM1kSposE9FPP1+RFGb8k6jQBTJx3TRz1yKilnGXQTZ5xvdOFpJrklIT0OXP1MG3+auM9FlJA+1dX90QoNJE5z7axmK//MOGXUdkqFe2kiDkorqjxwDvc0Js9pVKfKvAmW8YqUbmI9l0ERpWCXXnLVHNmPWz3jaPY+OBAmuJWDmxB/Z8p94aEDg4BVXQ7LvsKQ3DLYhaB7yJ390CJT+i0mm+EBqY60V6YikPSWDFrYQ/NPi2b1DgE19mX8zHqw8qprIl4yh1Ckx2Iige4En/N5ktOoIxnASxAw/TzcE2skxdw5KlHDF+UTj71m16CR/dIaKlXijlfNlNzUBo/bNSadCQn3G5NoO501wPKI:​XO50TgDNyo8EXAMPLE/g==:​1"} }
public func pollForTask(_ input: PollForTaskInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<PollForTaskOutput> 

Task runners call PollForTask to receive a task to perform from AWS Data Pipeline. The task runner specifies which tasks it can perform by setting a value for the workerGroup parameter. The task returned can come from any of the pipelines that match the workerGroup value passed in by the task runner and that were launched using the IAM user credentials specified by the task runner.

If tasks are ready in the work queue, PollForTask returns a response immediately. If no tasks are available in the queue, PollForTask uses long-polling and holds the connection open for up to 90 seconds, during which time the first newly scheduled task is handed to the task runner. To accommodate this, set the socket timeout in your task runner to 90 seconds. The task runner should not call PollForTask again on the same workerGroup until it receives a response, and this can take up to 90 seconds.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.PollForTask Content-Length: 59 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"workerGroup": "MyworkerGroup", "hostname": "example.com"}
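
A hedged sketch of one task-runner cycle, assuming the generated member names mirror the JSON keys above (workerGroup, hostname, taskObject, taskId) and that TaskStatus has a .finished case:

// Poll for work; the call may block for up to 90 seconds while long-polling.
let poll = try dataPipeline.pollForTask(
    .init(hostname: "example.com", workerGroup: "MyworkerGroup")
).wait()

if let task = poll.taskObject, let taskId = task.taskId {
    // ... perform the work, calling reportTaskProgress periodically ...
    let progress = try dataPipeline.reportTaskProgress(.init(taskId: taskId)).wait()
    if !progress.canceled {
        // Report the final outcome once the work is done.
        _ = try dataPipeline.setTaskStatus(.init(taskId: taskId, taskStatus: .finished)).wait()
    }
}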

putPipelineDefinition(_:logger:on:)

public func putPipelineDefinition(_ input: PutPipelineDefinitionInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<PutPipelineDefinitionOutput>

Adds tasks, schedules, and preconditions to the specified pipeline. You can use PutPipelineDefinition to populate a new pipeline.

PutPipelineDefinition also validates the configuration as it adds it to the pipeline. Changes to the pipeline are saved unless one of the following validation errors exists in the pipeline.

  1. An object is missing a name or identifier field.
  2. A string or reference field is empty.
  3. The number of objects in the pipeline exceeds the maximum allowed objects.
  4. The pipeline is in a FINISHED state.

Pipeline object definitions are passed to the PutPipelineDefinition action and returned by the GetPipelineDefinition action.

Example 1: This example sets a valid pipeline configuration and returns success.

Request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.PutPipelineDefinition Content-Length: 914 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-0937003356ZJEXAMPLE", "pipelineObjects": [ {"id": "Default", "name": "Default", "fields": [ {"key": "workerGroup", "stringValue": "workerGroup"} ] }, {"id": "Schedule", "name": "Schedule", "fields": [ {"key": "startDateTime", "stringValue": "2012-12-12T00:00:00"}, {"key": "type", "stringValue": "Schedule"}, {"key": "period", "stringValue": "1 hour"}, {"key": "endDateTime", "stringValue": "2012-12-21T18:00:00"} ] }, {"id": "SayHello", "name": "SayHello", "fields": [ {"key": "type", "stringValue": "ShellCommandActivity"}, {"key": "command", "stringValue": "echo hello"}, {"key": "parent", "refValue": "Default"}, {"key": "schedule", "refValue": "Schedule"} ] } ] }

Response:

HTTP/1.1 200 x-amzn-RequestId: f74afc14-0754-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 18 Date: Mon, 12 Nov 2012 17:50:53 GMT {"errored": false}

Example 2: This example sets an invalid pipeline configuration (the value for workerGroup is an empty string) and returns an error message.

Request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.PutPipelineDefinition Content-Length: 903 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE", "pipelineObjects": [ {"id": "Default", "name": "Default", "fields": [ {"key": "workerGroup", "stringValue": ""} ] }, {"id": "Schedule", "name": "Schedule", "fields": [ {"key": "startDateTime", "stringValue": "2012-09-25T17:00:00"}, {"key": "type", "stringValue": "Schedule"}, {"key": "period", "stringValue": "1 hour"}, {"key": "endDateTime", "stringValue": "2012-09-25T18:00:00"} ] }, {"id": "SayHello", "name": "SayHello", "fields": [ {"key": "type", "stringValue": "ShellCommandActivity"}, {"key": "command", "stringValue": "echo hello"}, {"key": "parent", "refValue": "Default"}, {"key": "schedule", "refValue": "Schedule"} ] } ] }

Response:

HTTP/1.1 200 x-amzn-RequestId: f74afc14-0754-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 18 Date: Mon, 12 Nov 2012 17:50:53 GMT {"__type": "com.amazon.setl.webservice#InvalidRequestException", "message": "Pipeline definition has errors: Could not save the pipeline definition due to FATAL errors: [com.amazon.setl.webservice.ValidationError@108d7ea9] Please call Validate to validate your pipeline"}
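
A hedged sketch, assuming the generated PipelineObject and Field initializers mirror the JSON structure shown in Example 1:

let pipelineObjects: [DataPipeline.PipelineObject] = [
    .init(fields: [.init(key: "workerGroup", stringValue: "workerGroup")], id: "Default", name: "Default"),
    .init(fields: [
        .init(key: "type", stringValue: "Schedule"),
        .init(key: "startDateTime", stringValue: "2012-12-12T00:00:00"),
        .init(key: "endDateTime", stringValue: "2012-12-21T18:00:00"),
        .init(key: "period", stringValue: "1 hour")
    ], id: "Schedule", name: "Schedule"),
    .init(fields: [
        .init(key: "type", stringValue: "ShellCommandActivity"),
        .init(key: "command", stringValue: "echo hello"),
        .init(key: "parent", refValue: "Default"),
        .init(key: "schedule", refValue: "Schedule")
    ], id: "SayHello", name: "SayHello")
]

let result = try dataPipeline.putPipelineDefinition(
    .init(pipelineId: "df-0937003356ZJEXAMPLE", pipelineObjects: pipelineObjects)
).wait()

if result.errored {
    // Inspect the validation errors before trying again.
    print(result.validationErrors ?? [])
}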

queryObjects(_:logger:on:)

public func queryObjects(_ input: QueryObjectsInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<QueryObjectsOutput>

Queries the specified pipeline for the names of objects that match the specified set of conditions.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.QueryObjects Content-Length: 123 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE", "query": {"selectors": [ ] }, "sphere": "INSTANCE", "marker": "", "limit": 10}

Example response:

x-amzn-RequestId: 14d704c1-0775-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 72 Date: Mon, 12 Nov 2012 17:50:53 GMT {"hasMoreResults": false, "ids": ["@SayHello_1_2012-09-25T17:00:00"] }
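
A hedged sketch, assuming QueryObjectsInput mirrors the JSON request above (an empty query matches every object in the given sphere):

let matches = try dataPipeline.queryObjects(
    .init(limit: 10, pipelineId: "df-06372391ZG65EXAMPLE", query: .init(selectors: []), sphere: "INSTANCE")
).wait()
print(matches.ids ?? [])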

removeTags(_:logger:on:)

public func removeTags(_ input: RemoveTagsInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<RemoveTagsOutput> 

Removes existing tags from the specified pipeline.

reportTaskProgress(_:logger:on:)

public func reportTaskProgress(_ input: ReportTaskProgressInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<ReportTaskProgressOutput>

Task runners call ReportTaskProgress when assigned a task to acknowledge that it has the task. If the web service does not receive this acknowledgement within 2 minutes, it assigns the task in a subsequent PollForTask call. After this initial acknowledgement, the task runner only needs to report progress every 15 minutes to maintain its ownership of the task. You can change this reporting time from 15 minutes by specifying a reportProgressTimeout field in your pipeline.

If a task runner does not report its status after 5 minutes, AWS Data Pipeline assumes that the task runner is unable to process the task and reassigns the task in a subsequent response to PollForTask. Task runners should call ReportTaskProgress every 60 seconds.

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.ReportTaskProgress Content-Length: 832 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"taskId": "aaGgHT4LuH0T0Y0oLrJRjas5qH0d8cDPADxqq3tn+zCWGELkCdV2JprLreXm1oxeP5EFZHFLJ69kjSsLYE0iYHYBYVGBrB+E/pYq7ANEEeGJFnSBMRiXZVA+8UJ3OzcInvXeinqBmBaKwii7hnnKb/AXjXiNTXyxgydX1KAyg1AxkwBYG4cfPYMZbuEbQJFJvv5C/2+GVXz1w94nKYTeUeepwUOFOuRLS6JVtZoYwpF56E+Yfk1IcGpFOvCZ01B4Bkuu7x3J+MD/j6kJgZLAgbCJQtI3eiW3kdGmX0p0I2BdY1ZsX6b4UiSvM3OMj6NEHJCJL4E0ZfitnhCoe24Kvjo6C2hFbZq+ei/HPgSXBQMSagkr4vS9c0ChzxH2+LNYvec6bY4kymkaZI1dvOzmpa0FcnGf5AjSK4GpsViZ/ujz6zxFv81qBXzjF0/4M1775rjV1VUdyKaixiA/sJiACNezqZqETidp8d24BDPRhGsj6pBCrnelqGFrk/gXEXUsJ+xwMifRC8UVwiKekpAvHUywVk7Ku4jH/n3i2VoLRP6FXwpUbelu34iiZ9czpXyLtyPKwxa87dlrnRVURwkcVjOt2Mcrcaqe+cbWHvNRhyrPkkdfSF3ac8/wfgVbXvLEB2k9mKc67aD9rvdc1PKX09Tk8BKklsMTpZ3TRCd4NzQlJKigMe8Jat9+1tKj4Ole5ZzW6uyTu2s2iFjEV8KXu4MaiRJyNKCdKeGhhZWY37Qk4NBK4Ppgu+C6Y41dpfOh288SLDEVx0/UySlqOEdhba7c6BiPp5r3hKj3mk9lFy5OYp1aoGLeeFmjXveTnPdf2gkWqXXg7AUbJ7jEs1F0lKZQg4szep2gcKyAJXgvXLfJJHcha8Lfb/Ee7wYmyOcAaRpDBoFNSbtoVXar46teIrpho+ZDvynUXvU0grHWGOk=:wn3SgymHZM99bEXAMPLE", "fields": [ {"key": "percentComplete", "stringValue": "50"} ] }

Example response:

x-amzn-RequestId: 640bd023-0775-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 18 Date: Mon, 12 Nov 2012 17:50:53 GMT {"canceled": false}

reportTaskRunnerHeartbeat(_:logger:on:)

public func reportTaskRunnerHeartbeat(_ input: ReportTaskRunnerHeartbeatInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<ReportTaskRunnerHeartbeatOutput>

Task runners call ReportTaskRunnerHeartbeat every 15 minutes to indicate that they are operational. If the AWS Data Pipeline Task Runner is launched on a resource managed by AWS Data Pipeline, the web service can use this call to detect when the task runner application has failed and restart a new instance.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.ReportTaskRunnerHeartbeat Content-Length: 84 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"taskrunnerId": "1234567890", "workerGroup": "wg-12345", "hostname": "example.com"}

Example response:

x-amzn-RequestId: b3104dc5-0734-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 20 Date: Mon, 12 Nov 2012 17:50:53 GMT {"terminate": false}

setStatus(_:logger:on:)

@discardableResult public func setStatus(_ input: SetStatusInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<Void>

Requests that the status of the specified physical or logical pipeline objects be updated in the specified pipeline. This update might not occur immediately, but is eventually consistent. The status that can be set depends on the type of object (for example, DataNode or Activity). You cannot perform this operation on FINISHED pipelines and attempting to do so returns InvalidRequestException.

Example request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.SetStatus Content-Length: 100 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-0634701J7KEXAMPLE", "objectIds": ["o-08600941GHJWMBR9E2"], "status": "pause"}

Example response:

x-amzn-RequestId: e83b8ab7-076a-11e2-af6f-6bc7a6be60d9 Content-Type: application/x-amz-json-1.1 Content-Length: 0 Date: Mon, 12 Nov 2012 17:50:53 GMT Unexpected response: 200, OK, undefined

setTaskStatus(_:logger:on:)

POST / HTTP/1.1 Content-Type:​ application/x-amz-json-1.1 X-Amz-Target:​ DataPipeline.SetTaskStatus Content-Length:​ 847 Host:​ datapipeline.us-east-1.amazonaws.com X-Amz-Date:​ Mon, 12 Nov 2012 17:​49:​52 GMT Authorization:​ AuthParams {"taskId":​ "aaGgHT4LuH0T0Y0oLrJRjas5qH0d8cDPADxqq3tn+zCWGELkCdV2JprLreXm1oxeP5EFZHFLJ69kjSsLYE0iYHYBYVGBrB+E/pYq7ANEEeGJFnSBMRiXZVA+8UJ3OzcInvXeinqBmBaKwii7hnnKb/AXjXiNTXyxgydX1KAyg1AxkwBYG4cfPYMZbuEbQJFJvv5C/2+GVXz1w94nKYTeUeepwUOFOuRLS6JVtZoYwpF56E+Yfk1IcGpFOvCZ01B4Bkuu7x3J+MD/j6kJgZLAgbCJQtI3eiW3kdGmX0p0I2BdY1ZsX6b4UiSvM3OMj6NEHJCJL4E0ZfitnhCoe24Kvjo6C2hFbZq+ei/HPgSXBQMSagkr4vS9c0ChzxH2+LNYvec6bY4kymkaZI1dvOzmpa0FcnGf5AjSK4GpsViZ/ujz6zxFv81qBXzjF0/4M1775rjV1VUdyKaixiA/sJiACNezqZqETidp8d24BDPRhGsj6pBCrnelqGFrk/gXEXUsJ+xwMifRC8UVwiKekpAvHUywVk7Ku4jH/n3i2VoLRP6FXwpUbelu34iiZ9czpXyLtyPKwxa87dlrnRVURwkcVjOt2Mcrcaqe+cbWHvNRhyrPkkdfSF3ac8/wfgVbXvLEB2k9mKc67aD9rvdc1PKX09Tk8BKklsMTpZ3TRCd4NzQlJKigMe8Jat9+1tKj4Ole5ZzW6uyTu2s2iFjEV8KXu4MaiRJyNKCdKeGhhZWY37Qk4NBK4Ppgu+C6Y41dpfOh288SLDEVx0/UySlqOEdhba7c6BiPp5r3hKj3mk9lFy5OYp1aoGLeeFmjXveTnPdf2gkWqXXg7AUbJ7jEs1F0lKZQg4szep2gcKyAJXgvXLfJJHcha8Lfb/Ee7wYmyOcAaRpDBoFNSbtoVXar46teIrpho+ZDvynUXvU0grHWGOk=:​wn3SgymHZM99bEXAMPLE", "taskStatus":​ "FINISHED"}
public func setTaskStatus(_ input: SetTaskStatusInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<SetTaskStatusOutput> 

Task runners call SetTaskStatus to notify AWS Data Pipeline that a task is completed and provide information about the final status. A task runner makes this call regardless of whether the task was successful. A task runner does not need to call SetTaskStatus for tasks that are canceled by the web service during a call to ReportTaskProgress.

Example response:

x-amzn-RequestId: 8c8deb53-0788-11e2-af9c-6bc7a6be6qr8 Content-Type: application/x-amz-json-1.1 Content-Length: 0 Date: Mon, 12 Nov 2012 17:50:53 GMT {}

validatePipelineDefinition(_:logger:on:)

public func validatePipelineDefinition(_ input: ValidatePipelineDefinitionInput, logger: Logger = AWSClient.loggingDisabled, on eventLoop: EventLoop? = nil) -> EventLoopFuture<ValidatePipelineDefinitionOutput>

Validates the specified pipeline definition to ensure that it is well formed and can be run without error.

Example 1: This example sets a valid pipeline configuration and returns success.

Request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.ValidatePipelineDefinition Content-Length: 936 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE", "pipelineObjects": [ {"id": "Default", "name": "Default", "fields": [ {"key": "workerGroup", "stringValue": "MyworkerGroup"} ] }, {"id": "Schedule", "name": "Schedule", "fields": [ {"key": "startDateTime", "stringValue": "2012-09-25T17:00:00"}, {"key": "type", "stringValue": "Schedule"}, {"key": "period", "stringValue": "1 hour"}, {"key": "endDateTime", "stringValue": "2012-09-25T18:00:00"} ] }, {"id": "SayHello", "name": "SayHello", "fields": [ {"key": "type", "stringValue": "ShellCommandActivity"}, {"key": "command", "stringValue": "echo hello"}, {"key": "parent", "refValue": "Default"}, {"key": "schedule", "refValue": "Schedule"} ] } ] }

Response:

x-amzn-RequestId: 92c9f347-0776-11e2-8a14-21bb8a1f50ef Content-Type: application/x-amz-json-1.1 Content-Length: 18 Date: Mon, 12 Nov 2012 17:50:53 GMT {"errored": false}

Example 2: This example sets an invalid pipeline configuration and returns the associated set of validation errors.

Request:

POST / HTTP/1.1 Content-Type: application/x-amz-json-1.1 X-Amz-Target: DataPipeline.ValidatePipelineDefinition Content-Length: 903 Host: datapipeline.us-east-1.amazonaws.com X-Amz-Date: Mon, 12 Nov 2012 17:49:52 GMT Authorization: AuthParams {"pipelineId": "df-06372391ZG65EXAMPLE", "pipelineObjects": [ {"id": "Default", "name": "Default", "fields": [ {"key": "workerGroup", "stringValue": "MyworkerGroup"} ] }, {"id": "Schedule", "name": "Schedule", "fields": [ {"key": "startDateTime", "stringValue": "bad-time"}, {"key": "type", "stringValue": "Schedule"}, {"key": "period", "stringValue": "1 hour"}, {"key": "endDateTime", "stringValue": "2012-09-25T18:00:00"} ] }, {"id": "SayHello", "name": "SayHello", "fields": [ {"key": "type", "stringValue": "ShellCommandActivity"}, {"key": "command", "stringValue": "echo hello"}, {"key": "parent", "refValue": "Default"}, {"key": "schedule", "refValue": "Schedule"} ] } ] }

Response:

x-amzn-RequestId: 496a1f5a-0e6a-11e2-a61c-bd6312c92ddd Content-Type: application/x-amz-json-1.1 Content-Length: 278 Date: Mon, 12 Nov 2012 17:50:53 GMT {"errored": true, "validationErrors": [ {"errors": ["INVALID_FIELD_VALUE: 'startDateTime' value must be a literal datetime value."], "id": "Schedule"} ] }

describeObjectsPaginator(_:logger:on:)

compiler(>=5.5.2) && canImport(_Concurrency)
public func describeObjectsPaginator(
        _ input: DescribeObjectsInput,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil
    ) -> AWSClient.PaginatorSequence<DescribeObjectsInput, DescribeObjectsOutput> 

Gets the object definitions for a set of objects associated with the pipeline. Object definitions are composed of a set of fields that define the properties of the object.

Returns a PaginatorSequence for the operation.

Parameters

  • input: Input for request
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
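
A hedged sketch using the Swift concurrency paginator from inside an async context (requires compiler(>=5.5.2) and _Concurrency, as noted above):

// Iterate over every page of results as an AsyncSequence.
let pages = dataPipeline.describeObjectsPaginator(
    .init(objectIds: ["Schedule"], pipelineId: "df-06372391ZG65EXAMPLE")
)
for try await page in pages {
    for object in page.pipelineObjects {
        print(object.id)
    }
}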

listPipelinesPaginator(_:logger:on:)

compiler(>=5.5.2) && canImport(_Concurrency)
public func listPipelinesPaginator(
        _ input: ListPipelinesInput,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil
    ) -> AWSClient.PaginatorSequence<ListPipelinesInput, ListPipelinesOutput> 

Lists the pipeline identifiers for all active pipelines that you have permission to access.

Returns a PaginatorSequence for the operation.

Parameters

  • input: Input for request
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on

queryObjectsPaginator(_:logger:on:)

compiler(>=5.5.2) && canImport(_Concurrency)
public func queryObjectsPaginator(
        _ input: QueryObjectsInput,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil
    ) -> AWSClient.PaginatorSequence<QueryObjectsInput, QueryObjectsOutput> 

Queries the specified pipeline for the names of objects that match the specified set of conditions.

Returns a PaginatorSequence for the operation.

Parameters

  • input: Input for request
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on

describeObjectsPaginator(_:_:logger:on:onPage:)

public func describeObjectsPaginator<Result>(
        _ input: DescribeObjectsInput,
        _ initialValue: Result,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil,
        onPage: @escaping (Result, DescribeObjectsOutput, EventLoop) -> EventLoopFuture<(Bool, Result)>
    ) -> EventLoopFuture<Result> 

Gets the object definitions for a set of objects associated with the pipeline. Object definitions are composed of a set of fields that define the properties of the object.

Provide paginated results to closure `onPage` for it to combine them into one result. This works in a similar manner to `Array.reduce(_:_:) -> Result`.

Parameters

  • input: Input for request
  • initialValue: The value to use as the initial accumulating value. `initialValue` is passed to `onPage` the first time it is called.
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
  • onPage: closure called with each paginated response. It combines an accumulating result with the contents of the response. This combined result is then returned along with a boolean indicating if the paginate operation should continue.
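
A hedged sketch of the reduce-style paginator, collecting every object id across pages into a single array (input member names assumed from the example JSON earlier on this page):

let allIds: [String] = try dataPipeline.describeObjectsPaginator(
    .init(objectIds: ["Schedule"], pipelineId: "df-06372391ZG65EXAMPLE"),
    [String]()
) { accumulated, page, eventLoop in
    // Append this page's ids and signal (true) that pagination should continue.
    let ids = accumulated + page.pipelineObjects.map { $0.id }
    return eventLoop.makeSucceededFuture((true, ids))
}.wait()
print(allIds)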

describeObjectsPaginator(_:logger:on:onPage:)

Provide paginated results to closure `onPage`.
public func describeObjectsPaginator(
        _ input: DescribeObjectsInput,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil,
        onPage: @escaping (DescribeObjectsOutput, EventLoop) -> EventLoopFuture<Bool>
    ) -> EventLoopFuture<Void> 

Parameters

  • input: Input for request
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
  • onPage: closure called with each block of entries. Returns boolean indicating whether we should continue.

listPipelinesPaginator(_:_:logger:on:onPage:)

public func listPipelinesPaginator<Result>(
        _ input: ListPipelinesInput,
        _ initialValue: Result,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil,
        onPage: @escaping (Result, ListPipelinesOutput, EventLoop) -> EventLoopFuture<(Bool, Result)>
    ) -> EventLoopFuture<Result> 

Lists the pipeline identifiers for all active pipelines that you have permission to access.

Provide paginated results to closure `onPage` for it to combine them into one result. This works in a similar manner to `Array.reduce(_:_:) -> Result`.

Parameters

  • input: Input for request
  • initialValue: The value to use as the initial accumulating value. `initialValue` is passed to `onPage` the first time it is called.
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
  • onPage: closure called with each paginated response. It combines an accumulating result with the contents of the response. This combined result is then returned along with a boolean indicating if the paginate operation should continue.

listPipelinesPaginator(_:logger:on:onPage:)

Provide paginated results to closure `onPage`.
public func listPipelinesPaginator(
        _ input: ListPipelinesInput,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil,
        onPage: @escaping (ListPipelinesOutput, EventLoop) -> EventLoopFuture<Bool>
    ) -> EventLoopFuture<Void> 

Parameters

  • input: Input for request
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
  • onPage: closure called with each block of entries. Returns boolean indicating whether we should continue.

queryObjectsPaginator(_:_:logger:on:onPage:)

public func queryObjectsPaginator<Result>(
        _ input: QueryObjectsInput,
        _ initialValue: Result,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil,
        onPage: @escaping (Result, QueryObjectsOutput, EventLoop) -> EventLoopFuture<(Bool, Result)>
    ) -> EventLoopFuture<Result> 

Queries the specified pipeline for the names of objects that match the specified set of conditions.

Provide paginated results to closure `onPage` for it to combine them into one result. This works in a similar manner to `Array.reduce(_:_:) -> Result`.

Parameters

  • input: Input for request
  • initialValue: The value to use as the initial accumulating value. `initialValue` is passed to `onPage` the first time it is called.
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
  • onPage: closure called with each paginated response. It combines an accumulating result with the contents of the response. This combined result is then returned along with a boolean indicating if the paginate operation should continue.

queryObjectsPaginator(_:logger:on:onPage:)

Provide paginated results to closure `onPage`.
public func queryObjectsPaginator(
        _ input: QueryObjectsInput,
        logger: Logger = AWSClient.loggingDisabled,
        on eventLoop: EventLoop? = nil,
        onPage: @escaping (QueryObjectsOutput, EventLoop) -> EventLoopFuture<Bool>
    ) -> EventLoopFuture<Void> 

Parameters

  • input: Input for request
  • logger: Logger used for logging
  • eventLoop: EventLoop to run this process on
  • onPage: closure called with each block of entries. Returns boolean indicating whether we should continue.