I have a simple Java program that reads messages from a Google Cloud Pub/Sub topic and prints them. It works correctly when I run it as a standalone program, but it fails to receive any messages when launched with spark-submit.
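The subscriber is essentially the standard pattern from the google-cloud-pubsub client (a sketch of what I'm running; the class name, project ID, and subscription ID below are placeholders, not my real values):

```java
import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;

public class PubsubPrinter {
  public static void main(String[] args) {
    // Placeholder IDs -- replaced with real project/subscription names in my setup.
    ProjectSubscriptionName subscription =
        ProjectSubscriptionName.of("my-project", "my-subscription");

    // Print each incoming message, then ack it so Pub/Sub does not redeliver.
    MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
      System.out.println("Received: " + message.getData().toStringUtf8());
      consumer.ack();
    };

    Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
    subscriber.startAsync().awaitRunning();
    subscriber.awaitTerminated(); // block, printing messages as they arrive
  }
}
```

Run directly on the JVM this prints messages as expected; submitted via spark-submit, the same jar starts but never prints anything.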
Do programs launched with spark-submit have to follow a different structure in general? My program doesn't do anything Spark-related yet, but I'll be adding that later.
Are there working examples of Spark + Cloud Pub/Sub integration? I came across a library called Apache Bahir, but is a library like that required?