Integrate with Central

Data producers looking to integrate with Central have two primary options: sending data through the Collector Endpoint or publishing directly to Kafka.

Generate data through the Collector endpoint

  1. Create a Feature Request (FR) card in the Central board

    Submit an FR card to track the onboarding process. This card will serve as the central point for communication and documentation of all onboarding tasks.

  2. Perform VPC peering with Edge VPC

    1. Establish VPC peering with the Edge VPC to enable communication between your VPC and the Edge VPC, which hosts Central's Collector Endpoint.
    2. Update the status of this step in the FR card.

    Details for the Edge VPC, such as the VPC ID and CIDR range, vary by region and can be obtained from the Central team. For instance:

    • Region: me-central-1
    • VPC ID: vpc-0d45c3f9556f5f4db
    • CIDR Range: 10.170.128.0/17
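    The peering request can be initiated from your side with the AWS CLI. The commands below are a minimal sketch for me-central-1, assuming your VPC is in the same region; your VPC ID, route table ID, and the resulting peering connection ID are placeholders, and the Central team still needs to accept the request on the Edge side:

      # Request peering with the Edge VPC (add --peer-owner-id <EDGE_ACCOUNT_ID> if it lives in another AWS account).
      aws ec2 create-vpc-peering-connection \
        --vpc-id <YOUR_VPC_ID> \
        --peer-vpc-id vpc-0d45c3f9556f5f4db

      # Once the peering connection is accepted, route the Edge CIDR through it.
      aws ec2 create-route \
        --route-table-id <YOUR_ROUTE_TABLE_ID> \
        --destination-cidr-block 10.170.128.0/17 \
        --vpc-peering-connection-id <PEERING_CONNECTION_ID>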
  3. Perform VPC association with Edge’s Route53 hosted zone

    1. After VPC peering, associate your VPC with Edge's Route53 Hosted Zone to ensure proper DNS resolution for Central's endpoints within your VPC.
    2. Update the status in the FR card.

    Route53 Hosted Zone IDs also vary by region. For example:

    • Region: me-central-1
    • Hosted Zone ID: Z0203536MG8D10RIUDIR
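    If the Edge hosted zone lives in a different AWS account, the association follows the same two-step flow documented for Central's hosted zone later in this guide: the Central team first authorizes your VPC, and your team then runs the association. A minimal sketch of your side, using the me-central-1 zone above:

      aws route53 associate-vpc-with-hosted-zone \
        --hosted-zone-id Z0203536MG8D10RIUDIR \
        --vpc VPCRegion=me-central-1,VPCId=<YOUR_VPC_ID>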
  4. Create a service using the DevOps Catalog

    1. Create a service through the DevOps Catalog.

      The service token provided will be used for authorization when sending data to the Collector Endpoint.

    2. Attach the DevOps ticket to the FR card.

  5. Create a Kafka topic

    1. Request the creation of a Kafka topic to store your data.
    2. Attach the DevOps ticket to the FR card.
  6. Create payload Type-to-Topic mapping

    1. Configure the mapping between your data payload type and the Kafka topic. This ensures the Central Collector routes your data to the appropriate topic based on the payload type.
    2. Attach the DevOps ticket for this step to the FR card.
  7. Optional: Create route mapping

    1. If needed, create a route mapping for further customization of data routing within Central. Refer to Section 3, "Configuring the Mapping with Central," for more details.
    2. Attach the DevOps ticket to the FR card if route mapping is implemented.
  8. Test the integration

    1. Test the setup by sending a sample payload to the Collector Endpoint (a sample request is sketched after these steps): http://central..edge/collector
    2. Verify that the payload is routed correctly to the intended Kafka topic.
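    A minimal smoke test can be sent with curl. The endpoint is the one listed above; the Authorization header and the JSON body below are illustrative assumptions, so confirm the exact request contract (including how the payload type is conveyed) with the Central team:

      # Hypothetical request; header names and body shape are assumptions.
      curl -X POST "http://central..edge/collector" \
        -H "Authorization: Bearer <SERVICE_TOKEN>" \
        -H "Content-Type: application/json" \
        -d '{"message": "central onboarding smoke test"}'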

Generate directly to Kafka

  1. Create a Feature Request (FR) card in the Central board

    Submit an FR card to track and document your onboarding process. This card serves as the central point of communication and progress tracking.

  2. Perform VPC Peering with Central Kafka VPC

    1. Establish VPC peering between your VPC and the Central Kafka VPC to enable direct communication for data publishing.

    2. Update the VPC peering status in your FR card once completed.

      Central Kafka VPC details vary by region. For example, in the me-central-1 region:

      • VPC ID: vpc-027d37ee431e606c3
      • CIDR Range: 10.166.88.0/22
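      The request mirrors the Edge VPC peering sketched earlier, with the Central Kafka VPC as the peer. A minimal sketch for me-central-1 (your VPC ID, route table ID, and peering connection ID are placeholders):

        aws ec2 create-vpc-peering-connection \
          --vpc-id <YOUR_VPC_ID> \
          --peer-vpc-id vpc-027d37ee431e606c3

        # Once accepted, route the Central Kafka CIDR through the peering connection.
        aws ec2 create-route \
          --route-table-id <YOUR_ROUTE_TABLE_ID> \
          --destination-cidr-block 10.166.88.0/22 \
          --vpc-peering-connection-id <PEERING_CONNECTION_ID>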
  3. Perform VPC association with Central’s Route53 hosted zone

    1. After VPC peering, associate your VPC with Central's Route53 hosted zone to ensure correct DNS resolution for Kafka within your VPC. This association involves two steps:

      1. The Central team initiates a VPC authorization request using the AWS CLI command:

        aws route53 create-vpc-association-authorization --hosted-zone-id <HOSTED_ZONE_ID> --vpc VPCRegion=<REGION>,VPCId=<VPC_ID>
      2. Your team associates your VPC with Central’s hosted zone using the command:

        aws route53 associate-vpc-with-hosted-zone --hosted-zone-id <HOSTED_ZONE_ID> --vpc VPCRegion=<REGION>,VPCId=<VPC_ID>

        Hosted Zone IDs differ by region; for example, in the me-central-1 region, the Hosted Zone ID is Z02001362JD21N1YA3K2N.

    2. Update your FR card with the status once completed.

  4. Create a topic

    1. Request the creation of a Kafka topic to store your data.
    2. Attach the corresponding DevOps ticket to your FR card for tracking.
  5. Obtain SASL Credentials

    The Central team will generate and share SASL credentials through the FR card. These credentials are required for authentication when publishing data directly to Kafka.

  6. Whitelist SASL credentials

    1. The Central team will whitelist the SASL credentials, granting your producer write access to the specified Kafka topic.
    2. Update the whitelist status in your FR card once completed.
  7. Test

    1. Test your Kafka producer clients using the provided SASL credentials and Kafka domains. Kafka domains vary by region. For example, in the me-central-1 region, the domains are:
      • kafka-1.uae.kafka
      • kafka-2.uae.kafka
      • kafka-3.uae.kafka
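      One way to exercise the connection is Kafka's console producer. The settings below are a sketch: the security protocol, SASL mechanism, and port 9096 are assumptions, so use the values shared alongside the SASL credentials in the FR card.

        # client.properties -- hypothetical SASL settings; confirm mechanism, protocol, and port.
        security.protocol=SASL_SSL
        sasl.mechanism=SCRAM-SHA-512
        sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
          username="<SASL_USERNAME>" password="<SASL_PASSWORD>";

        # Publish a test record to your topic using the regional brokers.
        kafka-console-producer.sh \
          --bootstrap-server kafka-1.uae.kafka:9096,kafka-2.uae.kafka:9096,kafka-3.uae.kafka:9096 \
          --topic <YOUR_TOPIC> \
          --producer.config client.properties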
    2. Update the testing status in your FR card after successful validation.

For both onboarding methods, communicating effectively with the Central team and maintaining transparency through the FR card are crucial: regularly updating the card with the status of each step keeps progress visible and helps surface issues early in the onboarding process.

Additional resources

The provided details primarily focus on onboarding as a data producer. For more comprehensive information on consuming data from Central, please reach out to the Central team directly.

You can contact them via email at central-team@freshworks.com or through their Slack channels: #fp-central-help and #fp-central-production-oncall.