Python Client with AWS MSK and Confluent Schema Registry and AVRO message payload

Albert Wong
Nov 13, 2024


Start from the official Python client's Avro producer example at https://github.com/confluentinc/confluent-kafka-python/blob/master/examples/avro_producer.py.

1. Create a new EC2 instance for the Python client.
2. Open up the MSK security group so that your EC2 client can connect to MSK. The EC2 instance's security group should also allow it to reach Confluent Cloud.
3. Change the schema registry config so that you can authenticate to the Confluent Schema Registry (this requires a Confluent Cloud account); see the sketch after the config line below.
schema_registry_conf = {'url': args.schema_registry, 'basic.auth.user.info':'5IXZIJATDXZ7PMEQ:IYPo7G7XJwz/AhMSeCTWZA+Yq/Rpsn'}
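For context, here is a minimal sketch of how that config is passed to a SchemaRegistryClient in the example script. The basic.auth.user.info value is your Confluent Cloud Schema Registry API key and secret in KEY:SECRET form; the placeholders below are not real credentials.

from confluent_kafka.schema_registry import SchemaRegistryClient

# Confluent Cloud Schema Registry endpoint plus API key/secret ("KEY:SECRET")
schema_registry_conf = {
    'url': args.schema_registry,
    'basic.auth.user.info': '<SR_API_KEY>:<SR_API_SECRET>'
}
schema_registry_client = SchemaRegistryClient(schema_registry_conf)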

4. Change the Kafka (MSK) config. Of the authentication options, you don't want IAM because it ties you to AWS. If you pick the SASL option for MSK, you have to use SASL_SSL together with the SCRAM-SHA-512 mechanism. To create the SASL username and password, see my other post. A minimal producer sketch follows the config line below.

producer_conf = {'bootstrap.servers': args.bootstrap_servers, 'sasl.username': 'admin', 'sasl.password': 'admin', 'sasl.mechanism': 'SCRAM-SHA-512','security.protocol':'SASL_SSL'}
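Roughly, the producer side of the example script then ties the two configs together like this. Names such as schema_str, args.topic, and the record dict are assumptions based on the official avro_producer.py, not a definitive implementation.

from confluent_kafka import Producer
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# MSK brokers plus the SASL/SCRAM credentials created earlier
producer_conf = {
    'bootstrap.servers': args.bootstrap_servers,
    'security.protocol': 'SASL_SSL',
    'sasl.mechanism': 'SCRAM-SHA-512',
    'sasl.username': 'admin',
    'sasl.password': 'admin'
}
producer = Producer(producer_conf)

# Serialize the record value as Avro against the schema registered in Confluent Cloud
avro_serializer = AvroSerializer(schema_registry_client, schema_str)
producer.produce(topic=args.topic,
                 value=avro_serializer({'name': 'example'},
                                       SerializationContext(args.topic, MessageField.VALUE)))
producer.flush()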

5. Run avro_producer.py:

python avro_producer.py -t albert -s https://psrc-lq2dm.us-east-2.aws.confluent.cloud -b b-1.onehouse.5tpo0i.c8.kafka.us-west-2.amazonaws.com:9096,b-2.onehouse.5tpo0i.c8.kafka.us-west-2.amazonaws.com:9096,b-3.onehouse.5tpo0i.c8.kafka.us-west-2.amazonaws.com:9096

6. Make the same schema registry and Kafka config changes in the consumer Python script (avro_consumer.py in the same examples directory) to get the consumer working; a minimal sketch follows.
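For reference, a minimal consumer sketch under the same assumptions: placeholder endpoints and credentials, and a topic name and group id chosen purely for illustration.

from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Same Schema Registry auth as the producer
schema_registry_client = SchemaRegistryClient({
    'url': '<SCHEMA_REGISTRY_URL>',
    'basic.auth.user.info': '<SR_API_KEY>:<SR_API_SECRET>'
})
avro_deserializer = AvroDeserializer(schema_registry_client)

# Same SASL_SSL / SCRAM-SHA-512 settings as the producer, plus consumer group settings
consumer_conf = {
    'bootstrap.servers': '<MSK_BOOTSTRAP_SERVERS>',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanism': 'SCRAM-SHA-512',
    'sasl.username': 'admin',
    'sasl.password': 'admin',
    'group.id': 'albert-consumer',
    'auto.offset.reset': 'earliest'
}
consumer = Consumer(consumer_conf)
consumer.subscribe(['albert'])

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    # Deserialize the Avro-encoded value using the schema fetched from Confluent Cloud
    record = avro_deserializer(msg.value(), SerializationContext(msg.topic(), MessageField.VALUE))
    print(record)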
