Azure Data Factory - Delta Format Vacuum error

b_d21 1 Reputation point
2020-07-20T05:52:08.297+00:00

I am getting the following error while executing an ADF data flow that writes in Delta format. I don't know where to set this property. Where should it be configured?


Operation on target dataflow1 failed: StatusCode "DFExecutorUserError", at Sink 'sink1':

java.lang.IllegalArgumentException: requirement failed:
Are you sure you would like to vacuum files with such a low retention period? If you have
writers that are currently writing to this table, there is a risk that you may corrupt the
state of your Delta table.

If you are certain that there are no operations being performed on this table, such as
insert/upsert/delete/optimize, then you may turn off this check by setting:
spark.databricks.delta.retentionDurationCheck.enabled = false

If you are not sure, please use a value not less than "168 hours".
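For context, the property named in the error is a Spark/Delta Lake setting, not something exposed as a named property in the ADF UI; in a mapping data flow the closest control is the Delta sink's Vacuum option (0 skips vacuum, otherwise a value of at least 168 hours is expected). In an environment where you control the Spark session directly (for example Databricks or open-source Delta Lake), disabling the check as the message suggests looks roughly like the sketch below. This is a hedged illustration, not the ADF-internal fix: the table path is hypothetical, and it assumes a SparkSession with the delta-spark package configured.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Only disable this check if you are certain no writers are active on the
# table; vacuuming with a low retention period can corrupt Delta table state.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

# Hypothetical path -- replace with your Delta table's storage location.
table = DeltaTable.forPath(spark, "/mnt/datalake/my_delta_table")

# The default (and recommended minimum) retention is 168 hours (7 days).
table.vacuum(retentionHours=0)
```

Since this requires a live Spark cluster with Delta Lake, treat it as a configuration sketch; in ADF itself the safe route is simply to leave the sink's Vacuum value at 168 hours or higher.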

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

2 answers

  1. Jack Ma 161 Reputation points
    2020-07-21T03:05:52.577+00:00

    Hi @bd21-6457 ,

This issue should be fixed now. Could you try again and let us know whether you still see the error?

    Thanks


  2. KranthiPakala-MSFT 46,462 Reputation points Microsoft Employee
    2020-07-21T07:08:09.577+00:00

    Hi @b_d21 ,

Sorry for the inconvenience. This was identified as a bug by the ADF engineering team, and a fix has already been deployed; you should no longer see this error message. Could you please try again and let us know if you still experience the issue?

    ----------

    Thank you
Please consider clicking "Accept Answer" and "Upvote" on the post that helped you, as it can benefit other community members.

