
@datafire/amazonaws_firehose

v5.0.0


DataFire integration for Amazon Kinesis Firehose


@datafire/amazonaws_firehose

Client library for Amazon Kinesis Firehose

Installation and Usage

npm install --save @datafire/amazonaws_firehose

// Create a client with your AWS credentials and region.
let amazonaws_firehose = require('@datafire/amazonaws_firehose').create({
  accessKeyId: "",
  secretAccessKey: "",
  region: ""
});

// Actions return promises; handle both resolution and rejection.
amazonaws_firehose.CreateDeliveryStream({
  "DeliveryStreamName": ""
}).then(data => {
  console.log(data);
}).catch(err => {
  console.error(err);
});

Description

Amazon Kinesis Firehose is a fully managed service that delivers real-time streaming data to destinations such as Amazon Simple Storage Service (Amazon S3), Amazon Elasticsearch Service (Amazon ES), and Amazon Redshift.

Actions

CreateDeliveryStream

amazonaws_firehose.CreateDeliveryStream({
  "DeliveryStreamName": ""
}, context)

Input

Output
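The signature above shows only DeliveryStreamName, but creating a usable stream also requires a destination configuration (see S3DestinationConfiguration and related entries under Definitions). A sketch of building such an input, assuming the field names of the underlying Firehose API (RoleARN, BucketARN, and placeholder ARNs you must replace):

```javascript
// Sketch: build a CreateDeliveryStream input for a DirectPut stream with an
// S3 destination. Field names follow the Firehose API; the ARN arguments are
// placeholders, and the buffering values are illustrative only.
function buildCreateInput(streamName, roleArn, bucketArn) {
  return {
    DeliveryStreamName: streamName,
    DeliveryStreamType: "DirectPut",
    S3DestinationConfiguration: {
      RoleARN: roleArn,
      BucketARN: bucketArn,
      BufferingHints: { SizeInMBs: 5, IntervalInSeconds: 300 }
    }
  };
}
```

The resulting object is what you would pass as the first argument to amazonaws_firehose.CreateDeliveryStream.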

DeleteDeliveryStream

amazonaws_firehose.DeleteDeliveryStream({
  "DeliveryStreamName": ""
}, context)

Input

Output

DescribeDeliveryStream

amazonaws_firehose.DescribeDeliveryStream({
  "DeliveryStreamName": ""
}, context)

Input

Output

ListDeliveryStreams

amazonaws_firehose.ListDeliveryStreams({}, context)

Input

Output
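ListDeliveryStreams is paginated. Assuming the output carries the Firehose API's DeliveryStreamNames and HasMoreDeliveryStreams fields, and that ExclusiveStartDeliveryStreamName (listed under Definitions as part of the input) resumes a listing, all names can be collected like so:

```javascript
// Sketch: page through ListDeliveryStreams until HasMoreDeliveryStreams is
// false. `client` is any object exposing the ListDeliveryStreams action shown
// above; output field names are assumed to match the Firehose API.
async function listAllDeliveryStreams(client) {
  const names = [];
  let input = {};
  for (;;) {
    const page = await client.ListDeliveryStreams(input);
    names.push(...page.DeliveryStreamNames);
    if (!page.HasMoreDeliveryStreams) return names;
    // Resume the listing after the last name we have seen.
    input = { ExclusiveStartDeliveryStreamName: names[names.length - 1] };
  }
}
```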

PutRecord

amazonaws_firehose.PutRecord({
  "DeliveryStreamName": "",
  "Record": {
    "Data": ""
  }
}, context)

Input

Output
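The underlying Firehose API carries record payloads as base64-encoded data. Assuming this client forwards Record.Data unmodified (an assumption worth checking against the integration), a small helper for encoding a JSON payload might look like:

```javascript
// Sketch: build a PutRecord input with a base64-encoded JSON payload.
// Assumes Record.Data is passed through to the Firehose API, which expects
// base64 -- verify this against the DataFire integration before relying on it.
function buildPutRecordInput(deliveryStreamName, payload) {
  // Newline-delimiting records makes them easier to split apart in S3.
  const json = JSON.stringify(payload) + "\n";
  return {
    DeliveryStreamName: deliveryStreamName,
    Record: { Data: Buffer.from(json).toString("base64") }
  };
}
```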

PutRecordBatch

amazonaws_firehose.PutRecordBatch({
  "DeliveryStreamName": "",
  "Records": []
}, context)

Input

Output
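The Firehose API caps PutRecordBatch at 500 records per call, so a larger record set has to be split into batch-sized inputs before being sent. A minimal chunking sketch:

```javascript
// Split an array of records into PutRecordBatch-sized inputs. The default
// of 500 matches the Firehose per-call record limit.
function toBatches(records, batchSize = 500) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push({ Records: records.slice(i, i + batchSize) });
  }
  return batches;
}
```

Each element of the returned array, plus a DeliveryStreamName, forms one PutRecordBatch input; note from PutRecordBatchResponseEntry below that individual records inside a batch can still fail and should be checked and resent.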

UpdateDestination

amazonaws_firehose.UpdateDestination({
  "DeliveryStreamName": "",
  "CurrentDeliveryStreamVersionId": "",
  "DestinationId": ""
}, context)

Input

Output
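UpdateDestination is a read-modify-write call: as the ConcurrentModificationException entry below notes, you fetch the stream's current VersionId and pass it as CurrentDeliveryStreamVersionId so racing writers are detected. A sketch, assuming the DescribeDeliveryStream output nests a DeliveryStreamDescription with a VersionId field as in the Firehose API:

```javascript
// Sketch: fetch the stream's current VersionId, then issue the update.
// `client` is any object exposing the two actions shown in this README;
// `update` holds the destination-specific fields (e.g. an
// S3DestinationUpdate), which vary by destination type.
async function updateDestinationSafely(client, streamName, destinationId, update) {
  const desc = await client.DescribeDeliveryStream({ DeliveryStreamName: streamName });
  return client.UpdateDestination({
    DeliveryStreamName: streamName,
    CurrentDeliveryStreamVersionId: desc.DeliveryStreamDescription.VersionId,
    DestinationId: destinationId,
    ...update
  });
}
```

If the call still fails with ConcurrentModificationException, re-fetch the VersionId and retry.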

Definitions

AWSKMSKeyARN

  • AWSKMSKeyARN string

BooleanObject

  • BooleanObject boolean

BucketARN

  • BucketARN string

BufferingHints

  • BufferingHints object: Describes hints for the buffering to perform before delivering data to the destination. Note that these options are treated as hints, so Kinesis Firehose may choose different values when it considers them optimal.
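A BufferingHints value pairs the SizeInMBs and IntervalInSeconds hints defined later in this list; delivery is triggered by whichever threshold is reached first. The values below are illustrative only, not service defaults:

```javascript
// Illustrative BufferingHints: deliver once 5 MB accumulate or 300 seconds
// elapse, whichever comes first (example values, not guaranteed defaults).
const bufferingHints = { SizeInMBs: 5, IntervalInSeconds: 300 };
```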

CloudWatchLoggingOptions

ClusterJDBCURL

  • ClusterJDBCURL string

CompressionFormat

  • CompressionFormat string (values: UNCOMPRESSED, GZIP, ZIP, Snappy)

ConcurrentModificationException

  • ConcurrentModificationException object: Another modification has already happened. Fetch VersionId again and use it to update the destination.

CopyCommand

CopyOptions

  • CopyOptions string

CreateDeliveryStreamInput

CreateDeliveryStreamOutput

Data

  • Data string

DataTableColumns

  • DataTableColumns string

DataTableName

  • DataTableName string

DeleteDeliveryStreamInput

DeleteDeliveryStreamOutput

  • DeleteDeliveryStreamOutput object

DeliveryStartTimestamp

  • DeliveryStartTimestamp string

DeliveryStreamARN

  • DeliveryStreamARN string

DeliveryStreamDescription

DeliveryStreamName

  • DeliveryStreamName string

DeliveryStreamNameList

DeliveryStreamStatus

  • DeliveryStreamStatus string (values: CREATING, DELETING, ACTIVE)

DeliveryStreamType

  • DeliveryStreamType string (values: DirectPut, KinesisStreamAsSource)

DeliveryStreamVersionId

  • DeliveryStreamVersionId string

DescribeDeliveryStreamInput

DescribeDeliveryStreamInputLimit

  • DescribeDeliveryStreamInputLimit integer

DescribeDeliveryStreamOutput

DestinationDescription

DestinationDescriptionList

DestinationId

  • DestinationId string

ElasticsearchBufferingHints

ElasticsearchBufferingIntervalInSeconds

  • ElasticsearchBufferingIntervalInSeconds integer

ElasticsearchBufferingSizeInMBs

  • ElasticsearchBufferingSizeInMBs integer

ElasticsearchDestinationConfiguration

ElasticsearchDestinationDescription

ElasticsearchDestinationUpdate

ElasticsearchDomainARN

  • ElasticsearchDomainARN string

ElasticsearchIndexName

  • ElasticsearchIndexName string

ElasticsearchIndexRotationPeriod

  • ElasticsearchIndexRotationPeriod string (values: NoRotation, OneHour, OneDay, OneWeek, OneMonth)

ElasticsearchRetryDurationInSeconds

  • ElasticsearchRetryDurationInSeconds integer

ElasticsearchRetryOptions

  • ElasticsearchRetryOptions object: Configures retry behavior in case Kinesis Firehose is unable to deliver documents to Amazon ES.

ElasticsearchS3BackupMode

  • ElasticsearchS3BackupMode string (values: FailedDocumentsOnly, AllDocuments)

ElasticsearchTypeName

  • ElasticsearchTypeName string

EncryptionConfiguration

ErrorCode

  • ErrorCode string

ErrorMessage

  • ErrorMessage string

ExtendedS3DestinationConfiguration

ExtendedS3DestinationDescription

ExtendedS3DestinationUpdate

HECAcknowledgmentTimeoutInSeconds

  • HECAcknowledgmentTimeoutInSeconds integer

HECEndpoint

  • HECEndpoint string

HECEndpointType

  • HECEndpointType string (values: Raw, Event)

HECToken

  • HECToken string

IntervalInSeconds

  • IntervalInSeconds integer

InvalidArgumentException

  • InvalidArgumentException object: The specified input parameter has a value that is not valid.

KMSEncryptionConfig

  • KMSEncryptionConfig object: Describes an encryption key for a destination in Amazon S3.

KinesisStreamARN

  • KinesisStreamARN string

KinesisStreamSourceConfiguration

  • KinesisStreamSourceConfiguration object: The stream and role ARNs for a Kinesis stream used as the source for a delivery stream.

KinesisStreamSourceDescription

LimitExceededException

  • LimitExceededException object: You have already reached the limit for a requested resource.

ListDeliveryStreamsInput

ListDeliveryStreamsInputLimit

  • ListDeliveryStreamsInputLimit integer

ListDeliveryStreamsOutput

LogGroupName

  • LogGroupName string

LogStreamName

  • LogStreamName string

NoEncryptionConfig

  • NoEncryptionConfig string (values: NoEncryption)

NonNegativeIntegerObject

  • NonNegativeIntegerObject integer

Password

  • Password string

Prefix

  • Prefix string

ProcessingConfiguration

Processor

ProcessorList

ProcessorParameter

ProcessorParameterList

ProcessorParameterName

  • ProcessorParameterName string (values: LambdaArn, NumberOfRetries, RoleArn, BufferSizeInMBs, BufferIntervalInSeconds)

ProcessorParameterValue

  • ProcessorParameterValue string

ProcessorType

  • ProcessorType string (values: Lambda)

PutRecordBatchInput

PutRecordBatchOutput

PutRecordBatchRequestEntryList

  • PutRecordBatchRequestEntryList array

PutRecordBatchResponseEntry

  • PutRecordBatchResponseEntry object: Contains the result for an individual record from a PutRecordBatch request. If the record is successfully added to your delivery stream, it receives a record ID. If the record fails to be added to your delivery stream, the result includes an error code and an error message.

PutRecordBatchResponseEntryList

PutRecordInput

PutRecordOutput

PutResponseRecordId

  • PutResponseRecordId string

Record

  • Record object: The unit of data in a delivery stream.

RedshiftDestinationConfiguration

RedshiftDestinationDescription

RedshiftDestinationUpdate

RedshiftRetryDurationInSeconds

  • RedshiftRetryDurationInSeconds integer

RedshiftRetryOptions

  • RedshiftRetryOptions object: Configures retry behavior in case Kinesis Firehose is unable to deliver documents to Amazon Redshift.

RedshiftS3BackupMode

  • RedshiftS3BackupMode string (values: Disabled, Enabled)

ResourceInUseException

  • ResourceInUseException object: The resource is already in use and not available for this operation.

ResourceNotFoundException

  • ResourceNotFoundException object: The specified resource could not be found.

RoleARN

  • RoleARN string

S3BackupMode

  • S3BackupMode string (values: Disabled, Enabled)

S3DestinationConfiguration

S3DestinationDescription

S3DestinationUpdate

ServiceUnavailableException

  • ServiceUnavailableException object: The service is unavailable; back off and retry the operation. If you continue to see the exception, throughput limits for the delivery stream may have been exceeded. For more information about limits and how to request an increase, see Amazon Kinesis Firehose Limits.

SizeInMBs

  • SizeInMBs integer

SourceDescription

  • SourceDescription object: Details about a Kinesis stream used as the source for a Kinesis Firehose delivery stream.

SplunkDestinationConfiguration

SplunkDestinationDescription

SplunkDestinationUpdate

SplunkRetryDurationInSeconds

  • SplunkRetryDurationInSeconds integer

SplunkRetryOptions

  • SplunkRetryOptions object: Configures retry behavior in case Kinesis Firehose is unable to deliver documents to Splunk or if it doesn't receive an acknowledgment from Splunk.

SplunkS3BackupMode

  • SplunkS3BackupMode string (values: FailedEventsOnly, AllEvents)

Timestamp

  • Timestamp string

UpdateDestinationInput

UpdateDestinationOutput

  • UpdateDestinationOutput object

Username

  • Username string