terraform databricks_job not starting every time when the job name is the same

#74935 · asked 2 years ago by Shankar

terraform apply is not starting the databricks_job run.

If I delete the job and run Terraform again, it creates and starts the job. Is it because the job name is the same? We want to trigger this job from Airflow, so we want to keep the job name static.
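One way to keep the job definition static in Terraform and still trigger it from Airflow (or any scheduler) is to look the job up by its fixed name through the Databricks Jobs REST API 2.1 and call run-now. The sketch below is illustrative only: the `/api/2.1/jobs/list` and `/api/2.1/jobs/run-now` paths are the documented Jobs API 2.1 endpoints, but the host, token, and helper names are assumptions, not part of the question.

```python
# Illustrative sketch: resolve a Databricks job by its static name and
# trigger it via the Jobs API 2.1. Host, token, and helper names are
# hypothetical; error handling is minimal.
import json
import urllib.request


def match_job_id(jobs: list, job_name: str):
    """Pure helper: return the job_id whose settings.name equals job_name."""
    for job in jobs:
        if job.get("settings", {}).get("name") == job_name:
            return job["job_id"]
    return None


def _call(host: str, token: str, path: str, payload=None):
    """Minimal authenticated request against the workspace REST API."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        f"{host}{path}",
        data=data,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST" if data else "GET",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def trigger_job_by_name(host: str, token: str, job_name: str):
    """Find the job named job_name and start a run; returns the run_id."""
    jobs = _call(host, token, "/api/2.1/jobs/list").get("jobs", [])
    job_id = match_job_id(jobs, job_name)
    if job_id is None:
        raise ValueError(f"no job named {job_name!r}")
    return _call(host, token, "/api/2.1/jobs/run-now", {"job_id": job_id})["run_id"]
```

With this pattern the Terraform resource only defines the job; each scheduled execution is a run-now call, so no Terraform apply is needed to start it.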

Also, terraform apply doesn't show any log output for creating the job.

Update: the reason for using an existing cluster is that we want to use the same cluster for multiple Spark jobs run in sequence.
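If the goal is several Spark jobs in sequence on the same existing cluster, one illustrative approach is to start each run and poll `/api/2.1/jobs/runs/get` until the previous run reaches a terminal `life_cycle_state` before starting the next. The terminal state names below follow the Jobs API documentation; the helper names and the 30-second poll interval are assumptions.

```python
# Illustrative sketch: wait for a Databricks job run to finish before starting
# the next job on the same cluster. Helper names and the poll interval are
# assumptions; TERMINAL_STATES follows the Jobs API life_cycle_state values.
import json
import time
import urllib.request

TERMINAL_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}


def is_finished(run: dict) -> bool:
    """Pure helper: True once the run's life_cycle_state is terminal."""
    return run.get("state", {}).get("life_cycle_state") in TERMINAL_STATES


def wait_for_run(host: str, token: str, run_id: int, poll_seconds: int = 30) -> dict:
    """Poll /api/2.1/jobs/runs/get until the run finishes, then return it."""
    while True:
        req = urllib.request.Request(
            f"{host}/api/2.1/jobs/runs/get?run_id={run_id}",
            headers={"Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req) as resp:
            run = json.load(resp)
        if is_finished(run):
            return run
        time.sleep(poll_seconds)
```

Calling `wait_for_run` between successive run-now calls keeps the jobs sequential while they all reuse the one cluster.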

sample code:

```hcl
resource "databricks_job" "db_job" {
  name           = "Sample Job"
  always_running = true

  existing_cluster_id = databricks_cluster.db_cluster.id

  library {
    jar = "dbfs:${local.dbfs_jar_file_path}"
  }

  spark_jar_task {
    main_class_name = "x.x.x.SomeApp"
    parameters = [
      "-Dconfig.file=/dbfs/config.conf"
    ]
  }
}
```

Tags: terraform, databricks, terraform-provider-databricks

0 Answers
