Managed Service for Apache Kafka API: Node.js Client


Managed Service for Apache Kafka API client for Node.js

A comprehensive list of changes in each version may be found in the CHANGELOG.

Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.

Table of contents:

  • Quickstart
  • Samples
  • Supported Node.js Versions
  • Versioning
  • Contributing
  • License

Quickstart

Before you begin

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the Managed Service for Apache Kafka API.
  4. Set up authentication so you can access the API from your local workstation (see the commands sketched after this list).
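
For steps 3 and 4, a minimal sketch using the gcloud CLI (assuming managedkafka.googleapis.com is the API's service identifier; my-project is a placeholder project ID):

gcloud services enable managedkafka.googleapis.com --project=my-project
gcloud auth application-default login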

Installing the client library

npm install @google-cloud/managedkafka

Using the client library

/**
 * This snippet has been automatically generated and should be regarded as a code template only.
 * It will require modifications to work.
 * It may require correct/in-range values for request initialization.
 * TODO(developer): Uncomment these variables before running the sample.
 */
/**
 *  Required. The parent location whose clusters are to be listed. Structured
 *  like `projects/{project}/locations/{location}`.
 */
// const parent = 'abc123'
/**
 *  Optional. The maximum number of clusters to return. The service may return
 *  fewer than this value. If unspecified, the server will pick an
 *  appropriate default.
 */
// const pageSize = 1234
/**
 *  Optional. A page token, received from a previous `ListClusters` call.
 *  Provide this to retrieve the subsequent page.
 *  When paginating, all other parameters provided to `ListClusters` must match
 *  the call that provided the page token.
 */
// const pageToken = 'abc123'
/**
 *  Optional. Filter expression for the result.
 */
// const filter = 'abc123'
/**
 *  Optional. Order by fields for the result.
 */
// const orderBy = 'abc123'

// Imports the Managedkafka library
const {ManagedKafkaClient} = require('@google-cloud/managedkafka').v1;

// Instantiates a client
const managedkafkaClient = new ManagedKafkaClient();

async function callListClusters() {
  // Construct request
  const request = {
    parent,
  };

  // Run request
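  // listClustersAsync returns an async iterable that fetches additional
  // pages automatically as you iterate, so no manual pageToken handling is
  // needed.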
  const iterable = managedkafkaClient.listClustersAsync(request);
  for await (const response of iterable) {
    console.log(response);
  }
}

callListClusters();
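
The same client exposes the other RPCs listed in the samples table below. As a further sketch (not part of the generated sample; the project, location, and cluster IDs are placeholders), fetching a single cluster by its full resource name might look like this:

const {ManagedKafkaClient} = require('@google-cloud/managedkafka').v1;

// Instantiates a client
const managedkafkaClient = new ManagedKafkaClient();

async function callGetCluster() {
  // Full resource name of the cluster to fetch (placeholder values).
  const request = {
    name: 'projects/my-project/locations/us-central1/clusters/my-cluster',
  };

  // getCluster resolves to an array whose first element is the Cluster resource.
  const [cluster] = await managedkafkaClient.getCluster(request);
  console.log(cluster.name);
}

callGetCluster();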

Samples

Samples are in the samples/ directory. Each sample's README.md has instructions for running its sample.

Sample | Source Code | Try it
Managed_kafka.add_acl_entry | source code | Open in Cloud Shell
Managed_kafka.create_acl | source code | Open in Cloud Shell
Managed_kafka.create_cluster | source code | Open in Cloud Shell
Managed_kafka.create_topic | source code | Open in Cloud Shell
Managed_kafka.delete_acl | source code | Open in Cloud Shell
Managed_kafka.delete_cluster | source code | Open in Cloud Shell
Managed_kafka.delete_consumer_group | source code | Open in Cloud Shell
Managed_kafka.delete_topic | source code | Open in Cloud Shell
Managed_kafka.get_acl | source code | Open in Cloud Shell
Managed_kafka.get_cluster | source code | Open in Cloud Shell
Managed_kafka.get_consumer_group | source code | Open in Cloud Shell
Managed_kafka.get_topic | source code | Open in Cloud Shell
Managed_kafka.list_acls | source code | Open in Cloud Shell
Managed_kafka.list_clusters | source code | Open in Cloud Shell
Managed_kafka.list_consumer_groups | source code | Open in Cloud Shell
Managed_kafka.list_topics | source code | Open in Cloud Shell
Managed_kafka.remove_acl_entry | source code | Open in Cloud Shell
Managed_kafka.update_acl | source code | Open in Cloud Shell
Managed_kafka.update_cluster | source code | Open in Cloud Shell
Managed_kafka.update_consumer_group | source code | Open in Cloud Shell
Managed_kafka.update_topic | source code | Open in Cloud Shell
Managed_kafka_connect.create_connect_cluster | source code | Open in Cloud Shell
Managed_kafka_connect.create_connector | source code | Open in Cloud Shell
Managed_kafka_connect.delete_connect_cluster | source code | Open in Cloud Shell
Managed_kafka_connect.delete_connector | source code | Open in Cloud Shell
Managed_kafka_connect.get_connect_cluster | source code | Open in Cloud Shell
Managed_kafka_connect.get_connector | source code | Open in Cloud Shell
Managed_kafka_connect.list_connect_clusters | source code | Open in Cloud Shell
Managed_kafka_connect.list_connectors | source code | Open in Cloud Shell
Managed_kafka_connect.pause_connector | source code | Open in Cloud Shell
Managed_kafka_connect.restart_connector | source code | Open in Cloud Shell
Managed_kafka_connect.resume_connector | source code | Open in Cloud Shell
Managed_kafka_connect.stop_connector | source code | Open in Cloud Shell
Managed_kafka_connect.update_connect_cluster | source code | Open in Cloud Shell
Managed_kafka_connect.update_connector | source code | Open in Cloud Shell
Quickstart | source code | Open in Cloud Shell

The Managed Service for Apache Kafka API Node.js Client API Reference documentation also contains samples.

Supported Node.js Versions

Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.

Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:

  • Legacy versions are not tested in continuous integration.
  • Some security patches and features cannot be backported.
  • Dependencies cannot be kept up-to-date.

Client libraries targeting some end-of-life versions of Node.js are available, and can be installed through npm dist-tags. The dist-tags follow the naming convention legacy-(version). For example, npm install @google-cloud/managedkafka@legacy-8 installs client libraries for versions compatible with Node.js 8.

Versioning

This library follows Semantic Versioning.

This library is considered to be in preview. This means it is still a work-in-progress and under active development. Any release is subject to backwards-incompatible changes at any time.

More Information: Google Cloud Platform Launch Stages

Contributing

Contributions welcome! See the Contributing Guide.

Please note that this README.md, the samples/README.md, and a variety of configuration files in this repository (including .nycrc and tsconfig.json) are generated from a central template. To edit one of these files, make an edit to its template in the central templates directory.

License

Apache Version 2.0

See LICENSE